
Interim - May 1987 





Quality Assurance/Quality Control Guidelines 

for 
San Joaquin Valley Drainage Program 
Investigations 

May 1987 



San Joaquin Valley Drainage Program 



These Guidelines were prepared by Dr. H. L. Young of ECOS Environmental 
and Energy Consultants, Sacramento, California, in cooperation with the 
San Joaquin Valley Drainage Program.



The San Joaquin Valley Drainage Program was established in mid-1984 
and is a cooperative effort of the U.S. Bureau of Reclamation, U.S. Fish 
and Wildlife Service, U.S. Geological Survey, California Department of 
Fish and Game, and California Department of Water Resources. The 
purposes of the Program are to investigate the problems associated with 
the drainage of agricultural lands in the San Joaquin Valley and to 
develop solutions to those problems. Consistent with these purposes, 
program objectives address the following key areas: (1) public health, 
(2) surface- and ground-water resources, (3) agricultural productivity, 
and (4) fish and wildlife resources.

Inquiries concerning the San Joaquin Valley Drainage Program may be 
directed to: 



San Joaquin Valley Drainage Program 
Federal-State Interagency Study Team 
2800 Cottage Way, Room W-2143 
Sacramento, California 95825-1898 



CONTENTS

1. Introduction
2. Quality Assurance Policy and Management
   2.1. QA Policy
   2.2. Responsibilities of QA Officer (QAO)
   2.3. QA Resources and Budget Provisions
3. Data Quality Objectives
   3.1. Precision and Accuracy
   3.2. Representativeness
   3.3. Data Comparability
   3.4. Data Completeness
4. Sample Procedures
   4.1. Sampling Procedures
   4.2. Sample Containers and Holding Time
   4.3. Sample Custody and Documentation
   4.4. Field Logbook
   4.5. Sample Submittal
   4.6. Sampling QA
   4.7. Reference Materials for Sampling and Field Measurements
5. Facilities, Equipment, and Services
   5.1. In-house Facility
   5.2. Contracted Laboratory
6. Calibration Procedures and Frequency
   6.1. Field Equipment
   6.2. Laboratory Equipment
7. Analytical Procedures
   7.1. Analytical Procedures
   7.2. Analytical Procedures for Air Samples
   7.3. Records and Documentation
   7.4. Reference Materials for Chemical Analysis
   7.5. Reference Materials for Biological Studies
8. Data Reduction, Validation, and Reporting
   8.1. Data Reduction
   8.2. Data Validation
   8.3. Data Reporting
   8.4. Data Assessment
   8.5. Data Storage
   8.6. Reference Materials
9. Internal Quality Control Checks
   9.1. QC Checks for Sampling
   9.2. Laboratory QC Checks
   9.3. Reference Materials for Laboratory Quality Control
10. Performance and System Audits
    10.1. In-house Performance and System Audits
    10.2. External Performance and System Audits
11. Specific Routine Procedures Used to Assess Data Precision, Accuracy, and Completeness
    11.1. Assessment of Precision
    11.2. Assessment of Accuracy
    11.3. Assessment of Current Data Based on Historical Data
12. Corrective Action
    12.1. Categories of Corrective Actions
    12.2. Person Responsible for Initiating Corrective Actions
13. QA Requirements for Procurement of Sampling and Analysis
    13.1. QA of Procurement Process
    13.2. QA Requirements of Contractors
    13.3. Examples of QA Requirements and QA Review Forms
14. Training Program
15. Quality Assurance Reports to Management
16. Safety
17. QA Implementation Schedule

APPENDICES

1. Typical Data Report Forms
2. Principles of Environmental Analysis
3. Sample Preservation Methods and Recommended Holding Time
4. Glossary



1. INTRODUCTION

This document is prepared for the San Joaquin Valley Drainage 
Program (SJVDP) to provide guidelines for developing a Quality Assurance 
(QA) Program.

The San Joaquin Valley Drainage Program is designed to manage salt 
deposition and trace element contamination in the valley drainage system 
and on the west side of the San Joaquin Valley. The drainage system 
includes the San Joaquin River, San Luis Drain, canals, Kesterson 
Reservoir, and the San Francisco Bay Delta. The goals of this program are 
to protect public health, to maintain drainage water discharge in 
conformance with effluent discharge standards through proper treatment 
and disposal, to preserve agricultural land, and to maintain the natural 
support system for fish and wildlife in the valley wetlands. The success 
of the program depends on reliable and accurate data in key areas; i.e., 
measurements of salt, trace metals, and selenium in drainage water, soil, 
air, and biota.

The San Joaquin Valley Drainage Program is a cooperative program 
which involves three Federal agencies and two State agencies; namely: the 
U.S. Bureau of Reclamation, U.S. Fish and Wildlife Service, U.S. 
Geological Survey, Department of Fish and Game, and Department of Water 
Resources. The intergovernmental organizational structure is depicted in 
Figure 1.

Figure 1. San Joaquin Valley Drainage Program Organization Structure

[Organization chart: Intergovernmental Coordination Team, Citizens 
Advisory Committee, Policy and Management Committee, National Academy of 
Sciences, and Program Manager and Study Team, with the Interagency 
Technical Advisory Committee and its Valley Biology, Geochemistry, 
Estuarine and Ocean Biology, Treatment and Disposal, On-Farm Management, 
and Public Health Subcommittees.]

The Interagency Technical Advisory Committee (ITAC) has been 
established to provide advice and an overview of program activities 
pertaining to the SJVDP. In addition, the National Academy of Sciences 
Committee on Irrigation-Induced Water Quality Problems will also provide 
an overview. Beyond these agencies, community agencies, academia, and 
the private sector also collect data on the San Joaquin Valley drainage 
system. Each of the participating agencies collects data for certain 
aspects of the program and develops individual solutions.

To ensure that data generated and used are of known and scientifically 
acceptable precision and accuracy, each participating agency and 
laboratory should develop and implement a comprehensive and 
cost-effective QA program. All agencies should use uniformly 
standardized or compatible procedures for sampling and analysis, as well 
as for data evaluation and assessment. The Interagency Technical 
Advisory Committee of the San Joaquin Valley Drainage Program should 
oversee the development and implementation of the QA program by each 
participating agency.

The primary objectives of this document are to:

- Highlight and inform the SJVDP participating agencies of their 
  quality assurance responsibilities in all phases of environmental 
  studies,
- Document the essential quality assurance/quality control (QA/QC) 
  elements needed to ensure data quality,
- Provide QA guidelines for planning and conducting SJVDP projects, 
  and
- Reference the acceptable methodologies for field sampling and 
  laboratory analysis, and the minimum performance standards.



A comprehensive QA program should include two major components: 
QA management and QC practices. QA management is responsible for the 
development of, and for overseeing the implementation of, the QA 
program. It encompasses six elements: quality assurance policy and 
management, data quality objectives, data assessment, performance and 
system audits, corrective action, and quality assurance reports. QC 
practices are the daily quality control activities carried out by 
project planning staff, field personnel, and laboratory staff to ensure 
high quality data. QC practices should encompass: sampling procedures, 
equipment maintenance and services, calibration procedures, analytical 
procedures, data reduction/validation/reporting, internal and external 
quality control checks, corrective actions, and participation in 
performance studies. These essential elements are discussed in the 
following sections. It is recommended that each agency follow these 
guidelines in developing its QA program.

It is recognized that individual participating agencies may have 
established QA/QC protocols that provide performance standards acceptable 
to the Interagency Technical Advisory Committee. In such cases, 
modification of existing QA/QC protocols will not be necessary. 
However, systemwide performance audits and interlaboratory comparison 
studies should be conducted routinely, and the Interagency Technical 
Advisory Committee should ensure that the QA objectives and performance 
standards are met by all participants.



2. QUALITY ASSURANCE POLICY AND MANAGEMENT

Quality assurance is a system for integrating the quality planning, 
quality assessment, and quality improvement efforts of various groups in 
an organization to enable operations to meet user requirements of 
standards and quality. In measurement systems, QA is concerned with all 
of the activities that have an important effect on the quality of the 
measurements, as well as the establishment of methods and techniques. 
The more authoritative usages differentiate between "quality assurance" 
and "quality control," where QC is "the system of activities to provide a 
quality product" and QA is "the system of activities to provide assurance 
that the QC system is performing adequately."

QA management should encompass the six major activities:

- Establish QA policy,
- Establish data quality objectives,
- Oversee all QC activities,
- Conduct data quality assessment,
- Manage audit systems, and
- Recommend corrective actions.

2.1. QA Policy

It should be a policy of each agency to ensure that all data 
generated by the agency and its contractors are of known and acceptable 
quality. Known quality refers to all components associated with data 
generation being thoroughly documented. Data are considered to be 
of acceptable quality if they are scientifically valid, legally defensible, 
accurate, precise, representative, comparable, and complete. To ensure 
the known and good quality of data, a centrally managed Quality Assurance 
Program shall be established for all monitoring activities and related 
studies. The QA program should be documented and updated each year. The 
Agency/Regional Director has the overall responsibility for the agency's 
QA program. Daily QA management should be delegated to the QA Officer, 
who should be a full-time employee managing all in-house QA 
activities and overseeing contractors' QC practices.

Each agency should allocate sufficient personnel and funds for all 
QA/QC activities, including various interlaboratory comparison studies 
and field and laboratory evaluations/audits.

A QA implementation schedule or QA work plan should be prepared by 
each agency at the beginning of each fiscal year.

It is recommended that a QA training program be established. 
Individuals designing experiments and collecting samples often are 
relatively inexperienced with QC protocols. Properly conducted QC training 
for these individuals will help meet program-required standards.

2.2. Responsibilities of QA Officer (QAO)

The QAO should be independent of any study program/project and 
should report to top management. The QA Officer shall be responsible 
for:

1. Recommend QA policy to the management.

2. Develop and manage the agency QA program.

3. Prepare the agency QA Program Plan, and revise it as needed.

4. Oversee quality control practices for field, laboratory, 
and data management.

5. Develop or approve sampling and analytical procedures used in 
the agency or by its contractors.

6. Assist in developing precision and accuracy guidelines/criteria 
for data acceptance.

7. Review data quality in accordance with the established 
guidelines/criteria.

8. Conduct data quality and laboratory performance follow-up, or 
approve data usage.

9. Recommend corrective actions.

10. Coordinate all QA/QC activities.

11. Recommend QC training for management and technical personnel.

2.3. QA Resources and Budget Provisions

Each agency shall provide sufficient personnel and funding for the 
QA program necessary to ensure data quality. At a minimum, the 
following personnel time should be designated for QA/QC activities: a 
full-time QA Officer, ten to fifteen percent (10-15%) of all field and 
laboratory personnel time, and approximately twenty percent (20%) of the 
time for data reduction and storage.

If a small laboratory has difficulty designating a full-time QA 
Officer because of limited staff, the laboratory director/manager should 
be the designated QAO.

Provisions for the QA program must be considered at the early 
planning stage of a project, so that adequate funding can be included in 
the budgetary process.



3. DATA QUALITY OBJECTIVES

Data quality objectives (DQOs) are statements of the quality of 
data needed to support a specific decision or action. Thus, DQOs vary 
with the goal and usage of the data collected. Program/project managers 
should be involved in the development of DQOs for data collection 
activities. DQOs are the mechanism for balancing the conflicting demands 
of time and resource constraints on the one hand and the need for data of 
acceptable quality and utility on the other.

The quality of data is generally expressed in terms of precision, 
accuracy or bias, representativeness, comparability, and completeness. 
The development of data quality objectives is a complex, iterative 
process which is a joint responsibility of the respective 
program/project manager, QA Officer, technical staff, and laboratory 
director at the planning stage.

DQOs are developed in three general stages. First, the program/ 
project manager states the purpose of the data collection and provides 
some preliminary guidance on the resources, time constraints, and 
quality of data needed. Second, the technical staff, including the QAO, 
refines the data quality goals through discussion with the program/project 
manager. Based on the time and resource constraints stated by the 
program manager and the technological constraints of the methodology used 
for sampling and analysis, the technical staff develops various approaches 
and presents them to the manager for final decision. Third, the manager 
selects the approach best suited to the project goal, and will then 
understand what the data quality will be.



3.1. Precision and Accuracy

Data precision and accuracy (P and A) are quantitative measures 
that characterize the amount of data variability caused by sample 
collection and processing, and laboratory analysis. Precision refers to 
the level of agreement among repeated measurements of the same 
characteristic. Accuracy refers to the difference between measurement 
results/data and the true value of the parameter being measured. 
Accuracy is commonly assessed using standard reference materials and/or 
spiked samples. Sampling precision and accuracy vary considerably 
depending on the sampling procedures used, sampling site, sample matrix, 
homogeneity of the sample, etc., and are difficult to assess. P and A for 
each parameter in each matrix, e.g., selenium in water, fish, etc., must 
be individually determined in each case. Analytical P and A, on the 
other hand, are easier to define. Such information is available in most 
analytical methodologies, EPA performance studies, and the like; it 
should be used as laboratory performance guidelines. It should be noted 
that analytical P and A are concentration dependent. In developing 
criteria for overall data precision and accuracy, the P and A of sample 
collection, sample handling and extraction, and analysis for a specific 
analyte and each sample medium must be considered. The overall data 
precision and accuracy may be estimated from the data of field QC samples 
submitted as blinds; i.e., replicate/duplicate field samples, field 
spikes, and standard reference samples (EPA, NBS, USGS, etc.), 
respectively. Standard deviations and 95 percent confidence limits of 
field replicate/duplicate samples, field spiked samples, reference 
samples, etc., should be computed from 10 or more sets of such samples 
from the same or similar sources. The 95 percent confidence limits may be 
used to establish the data P and A objectives. Until the data P and A 
objectives are established, 95 percent confidence limits of analytical P 
and A for field QC samples may be used as interim data acceptance 
guidelines.
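
For illustration only, a minimal Python sketch of this computation is 
given below; the sample values are hypothetical and serve only to show how 
the interim 95 percent confidence limits could be derived from ten blind 
field QC results.

    # Illustrative sketch: interim acceptance limits from 10 or more
    # blind field QC results (hypothetical selenium values, ug/L).
    import statistics

    results = [4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 5.0, 4.9, 5.1]

    mean = statistics.mean(results)
    s = statistics.stdev(results)  # sample standard deviation
    lower, upper = mean - 1.96 * s, mean + 1.96 * s

    print(f"mean = {mean:.2f} ug/L, s = {s:.2f} ug/L")
    print(f"95 percent confidence limits: {lower:.2f} to {upper:.2f} ug/L")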

P and A values for some common analyses of water samples are illustrated 
in Table 1. These values may assist project managers in establishing 
P and A requirements for individual projects.






Table 1. Precision and Accuracy of Trace Elements 
Commonly Analyzed in Water Samples

                               Method       Accuracy          Precision
                               Detection    (% of Spike       (RPD, %
Analysis     Method            Limit        Recovery, Ref:    Difference)
                                            EPA Method)

Arsenic      gaseous hydride   2 ug/L       ±10
Boron        colorimetric                   ±15               ±20
Cadmium      furnace           1 ug/L       ±10               ±10
Chromium     furnace           1 ug/L       ±10               ±10
Copper       furnace           1 ug/L       ±20               ±10
Iron         furnace           1 ug/L       ±10               ±10
Lead         furnace           1 ug/L       ±10               ±10
Manganese    furnace           0.2 ug/L     ±10               ±10
Mercury      cold vapor        0.2 ug/L     ±20               ±20
Molybdenum   furnace           1 ug/L       ±10               ±20
Nickel       furnace           1 ug/L       ±10               ±15
Selenium     gaseous hydride                ±10               ±20
Silver       furnace           0.2 ug/L     ±10               ±10
Zinc         furnace           0.1 ug/L     ±10               ±10

Note: Precision may also be expressed as two standard deviations of 
laboratory analysis.


P and A for soil, sediment, and biological samples commonly vary 
more than those for water samples, due to matrix problems and the 
additional extraction process.

3.2. Representativeness

Representativeness is a qualitative element; it refers to the 
collection of representative samples or the selection of representative 
sample aliquots for analysis. Such samples truly reflect the 
characteristics of the media at the sampling point. It also includes how 
well the sampling point represents the actual parameter variations under 
study. Representativeness of samples depends upon the sampling design 
and protocol. To ensure data representativeness, the following shall be 
adhered to:

1. Written sampling procedures.

2. Predetermined sampling sites.

3. Chain of custody procedures.

4. Documented field observations.

Several procedures, such as compositing samples from a sampling 
site, statistical random sampling, and the like, can be used to improve 
the representativeness of samples, heterogeneous samples in particular.

3.3. Data Comparability

Data comparability refers to the similarity of data generated by 
different groups. Considerations during planning to avoid data 
noncomparability between different agencies or between different 
analytical methods are necessary. To ensure data comparability, 
different agencies should use uniform or comparable sampling procedures 
as well as analytical methods. Interlaboratory comparison studies 
should be conducted to ensure the comparability of data generated by 
different laboratories.

Data must be reported in the units designated in the USEPA STORET 
system, e.g., mg/L or ug/L for water samples, and mg/kg or ug/kg for 
soil, sediment, or tissue samples.

3.4. Data Completeness

Data completeness refers to the amount of data successfully 
collected with respect to the amount intended in the design. A certain 
percentage of the intended data must be successfully collected for 
conclusions to be drawn based on the data acceptance criteria.

The determination of data completeness is the responsibility of the 
program/project manager. It is expected that laboratories should provide 
data meeting QC acceptance criteria for 95 percent or more of the 
determinations requested. At the same time, it is incumbent on 
program/project planners to identify any sample types, such as controls 
and QC samples, which require 100 percent completeness.
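
As a simple illustration (a hypothetical sketch, not part of the original 
guidelines), completeness is the percentage of requested determinations 
that were successfully obtained:

    # Sketch: percent completeness of a data collection effort.
    def completeness(valid: int, requested: int) -> float:
        """Percentage of requested determinations successfully obtained."""
        return 100.0 * valid / requested

    # Example: 190 of 200 requested determinations met QC criteria,
    # which just meets the 95 percent expectation stated above.
    print(f"{completeness(190, 200):.1f}%")  # 95.0%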






4. SAMPLE PROCEDURES



4.1. Sampling Procedures

The ultimate accuracy of any data generation begins with a sampling 
procedure that is well conceived and implemented. Efforts shall be made 
to ensure sample representativeness. All sampling procedures for a 
specific medium and with the same objective should be consistent. The 
same or compatible sampling procedures should be used by all 
participating agencies to ensure data comparability. It is advisable 
to adopt sampling procedures established by another agency, e.g., the 
biological sampling procedures established by the Fish and Wildlife 
Service, U.S. Department of the Interior.

Each agency should establish and document all sampling procedures 
to be used. Standard operating procedures (SOPs) for sampling must be 
written in detailed, clear, simple, and easy to follow language. Sampling 
procedures shall include site selection methodologies, sampling 
protocols, and requirements for sample containers, labeling, handling, 
and sample preservation and holding time. Field personnel shall adhere 
closely to the SOPs. Any unavoidable changes in procedure shall be 
recorded in detail in the field logbook with the sampler's signature.

A list of references for sampling procedures is presented in 
section 4.7.






4.2. Sample Containers and Holding Time

Using properly cleaned and suitable containers and correct 
preservatives, as well as adhering to proper holding times, are essential 
factors for maintaining sample representativeness. EPA or California 
requirements for sample containers, preservation techniques, and holding 
times, or ASTM standard practices, should be used (refs. 5 and 6). The 
EPA requirements are published in 40 CFR Part 136, Guidelines 
Establishing Test Procedures for the Analysis of Pollutants Under the 
Clean Water Act; Final Rule and Interim Final Rule and Proposed Rule; 
Federal Register Vol. 49, No. 209, Friday, October 26, 1984.

Standard operating procedures for cleaning and preparation of 
glassware and sample containers must be prepared and strictly adhered to.

4.3. Sample Custody and Documentation

Although most of the data are not used for enforcement purposes, 
well designed sample custody and sample submittal procedures are 
important to ensure data quality. The office responsible for sampling 
should develop such procedures and sample custody forms (an example 
custody form is given in the Appendix).

Sample custody is a vital aspect of any monitoring program 
generating data which may be used as evidence in a court of law. This 
procedure should be made part of the project plan, or referenced in the 
plan by citing the documented standard operating procedures (SOPs) for 
both field and laboratory.

Sample custody consists of two components, documentation and actual 
physical custody of the official sample, and two distinctive phases: 
custody in the field and in the laboratory.

The following principles should apply to handling samples from the 
point of collection through project completion or the completion of 
all contemplated or actual legal action. The sample is considered in 
"custody" if:

- It is in one's actual physical possession or view,
- It is in one's physical possession and has not been tampered 
  with, i.e., under lock or under official seal,
- It is retained in a secured area with restricted access, or
- It is placed in a container and secured with an official 
  seal(s) such that the sample cannot be reached without 
  breaking the seal(s).

4.4. Field Logbook

All information pertinent to field activities should be recorded 
in a bound notebook with consecutively numbered pages. Entries in the 
logbook should include, as applicable, the following:

- Date and time of entry
- Name and address of field contact
- Responsible party and the site address
- Type of process producing the material
- Type of media sampled (wastewater, sediment, etc.)
- Description of sample
- Number and size of samples taken
- Description of sampling location
- Date and time of sample collection
- Sample identification number(s) and collector's name
- References such as maps or photographs, if necessary
- Field observations
- Any field measurements made, such as pH, organic vapors, etc.

Because sampling situations vary widely, notes should be as descriptive 
and inclusive as possible. Those reading the entries should be able to 
reconstruct the sampling situation from the recorded information. 
Records should be objective and factual. The person who makes an entry 
should date and sign it.

4.5. Sample Submittal

Each sample must be labeled using waterproof ink and sealed 
immediately after it is collected. A sample tag attached by string to 
the sample container is acceptable when gummed labels are not available. 
The label should include the following information: name of collector, 
date and time of collection, place of collection, and sample number.

After collection, the samples (together with the sample form and chain 
of custody form) should be carefully packed and delivered to the in-house 
laboratory by the person who collected the samples. Samples to be 
submitted to out-of-town laboratories should be accompanied by sample 
forms and chain of custody forms and shipped through Federal Express or 
the like.

Laboratories receiving samples should prepare and follow written 
SOPs for sample receiving and storage. A permanent record of the 
sample log should also be kept in the laboratory for 5 years.

4.6. Sampling QA

To evaluate and ensure acceptable precision and accuracy of 
data, 10 percent of the samples (or one sample in a batch of fewer than 
10 collected in a single day) should be collected in duplicate and 
submitted to the laboratory as blinds. Also, 10 percent of the samples 
(or one sample of a batch of fewer than 10 collected in a single day) 
will be collected in duplicate, and one of the duplicates spiked with a 
known compound of known quantity. The spiked sample is submitted to the 
laboratory as a blind along with the other samples. Additionally, a 
standard reference sample or a recently analyzed sample can be 
resubmitted as a blind with each batch of samples. Preferably, reference 
samples should be prepared with natural water similar to the 
environmental samples.
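
A minimal Python sketch of the frequency rule stated above (one blind 
duplicate and one blind spike per 10 samples, or one each for a smaller 
daily batch); the function name and batch sizes are illustrative 
assumptions:

    # Sketch: number of blind duplicates (and blind spikes) required for
    # a day's batch under the 10 percent / one-per-small-batch rule.
    def blind_qc_count(batch_size: int) -> int:
        return max(1, batch_size // 10)

    for n in (4, 10, 23):
        print(f"{n} samples -> {blind_qc_count(n)} blind duplicate(s), "
              f"{blind_qc_count(n)} blind spike(s)")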

Using spiked samples is a good technique to evaluate laboratory 
performance as well as to detect matrix interference in water samples. 
However, one should be cautious in using the spiking technique on soil, 
sediment, or biological samples. The chemicals spiked and those present 
in the sample originally may respond differently to sample preparation, 
extraction, and/or analysis. For the latter types of samples, splitting 
samples with another laboratory or using another methodology are 
alternate QC protocols.






4.7. Reference Materials for Sampling and Field Measurements

Materials are available from several sources. Typical ones are 
listed below:

1. Handbook for Sampling and Sample Preservation of Water and 
Wastewater, Environmental Monitoring and Support Laboratory, 
EPA, EPA-600/4-76-049, or the latest edition.

2. Preparation of Soil Sampling Protocol: Techniques and 
Strategies, by Benjamin J. Mason, EPA-600/4-83-020, August 
1983.

3. ASTM Annual Book of Standards, Part 31, D 3694, "Standard 
Practice for Preparation of Sample Containers and for 
Preservation," American Society for Testing and Materials, 
Philadelphia, PA, 1980.

4. Buchanan, T. J., and Somers, W. P., Discharge Measurements at 
Gauging Stations: U.S. Geological Survey Techniques of Water 
Resources Investigations, Book 3, Chapter A8, p. 68, 1969.

5. Guidelines Establishing Test Procedures for the Analysis of 
Pollutants Under the Clean Water Act; Final Rule and Interim 
Final Rule and Proposed Rule; Federal Register Vol. 49, 
No. 209, Oct. 26, 1984. (Also 40 CFR Part 136.)

6. Collection, Pretreatment, Storage and Transportation of Water 
and Wastewater Samples, CA Department of Health Services, 
July 1, 1985.

7. Samplers and Sampling Procedures for Hazardous Waste Streams, 
EPA-600/2-80-018, Jan. 1980, or latest edition.

8. Sampling for Hazardous Materials, Office of Emergency and 
Remedial Response, U.S. EPA, Cincinnati, Ohio, Nov. 1984.

9. Water Quality Field Sampling Manual, Environmental Science 
and Engineering, Inc., Gainesville, Florida, 1977.

10. Kratochvil, B., and Taylor, J. K., Sampling for Chemical 
Analysis, Anal. Chem., Vol. 53, 924A-938A, 1981.






5. FACILITIES, EQUIPMENT, AND SERVICES

Every laboratory is responsible for the quality of the data it reports 
and shall prepare and implement a QA/QC plan.

Each laboratory should designate a Quality Control Officer to 
oversee and implement the laboratory quality control activities, 
including:

1. Preparation of QC samples.

2. Preparation of QC charts.

3. Review of QC data and QC activities.

4. Review and approval of data quality prior to reporting.

5. Training of analysts.

5.1. In-House Facility

The agency laboratory must have sufficient space, proper equipment, and 
experienced analysts to perform all analyses.

Each laboratory should prepare a laboratory manual which includes 
QA/QC provisions, and should have a full-time QC Officer when possible. 
The laboratory manual should encompass the following SOPs:

1. Analysts' qualification requirements.

2. Sample logging system.

3. Equipment maintenance and repairs.

4. Glassware cleaning and sample container preparation.

5. Analytical methodologies.

6. Equipment calibration and method validation.

7. Internal and external QC frequencies and types of QC samples 
to be performed, interlaboratory performance studies, etc.

8. Recordkeeping.

9. Data validation.

10. Data reporting, including deviations from standard procedures 
or inconsistent results in the report.

5.2. Contracted Laboratory

As with the in-house laboratory, a contracted laboratory must have 
sufficient space, suitable equipment, a well documented QA/laboratory 
manual, a qualified full-time QCO, and experienced analysts to conduct 
the analyses to be contracted. In addition, a laboratory bidding for a 
contract shall submit with its bid its laboratory QA plan/manual for 
review and approval. Prior to granting the contract, the laboratory must 
satisfactorily analyze the performance samples submitted by the 
contracting agency.






6. CALIBRATION PROCEDURES AND FREQUENCY

6.1. Field Equipment

All field equipment shall be calibrated, and serviced if necessary, 
prior to being taken out to the field, and calibrated again in the field 
prior to measurements. Calibration procedures should be in accordance 
with the manufacturer's specifications. The pH meter must be calibrated 
in the field with three pH standards before daily use and should be 
checked with one standard after each measurement. All calibrations and 
repairs should be recorded in the logbook.

6.2. Laboratory Equipment 

Calibration procedures and frequency shall be established and 
documented in the laboratory's quality control manual. All calibration 
records including the date and name of the person conducting the 
calibration should be documented in detail in a bound notebook. 






7. ANALYTICAL PROCEDURES

Laboratories generating data for the San Joaquin Valley Drainage 
Program shall prepare an easy to follow written protocol for each 
analytical method, for daily use and for training new analysts. The 
protocol should also include detailed sample preparation, extraction, 
and/or digestion components.

Prior to the use of any analytical procedure for the analysis of 
environmental samples, the laboratory must establish the analytical 
precision and accuracy of individual analysts and of the laboratory as a 
whole.

The preparation of standards and reagents used for equipment 
calibration and method standardization should be recorded in a bound 
notebook. The name of the person preparing the standards and/or reagents 
and the date of preparation should also be recorded in the notebook and 
on the container as well.

If a necessary analytical procedure is not available in any of the 
analytical manuals, an official procedure published by the Association of 
Official Analytical Chemists or in a reputable scientific journal may 
be used. Also, procedures developed by a participating laboratory that 
generate data with acceptable P and A may be used.

7.1. Analytical Procedures

To ensure data precision and accuracy, well-controlled, proper 
analytical methodologies must be used. The laboratories generating data 
for the SJVDP should use uniform or equivalent analytical methods; e.g., 
EPA-approved procedures for the analysis of water and wastewater, Standard 
Methods for the Examination of Water and Wastewater, or equivalent 
procedures published by USGS or in ASTM. Most of these methodologies 
describe chemical analyses in a water matrix. For chemical determinations 
in other matrices, the same analytical procedures, e.g., atomic 
absorption, furnace AA, hydride generation, neutron activation, etc., can 
be used. However, the sample must be digested or extracted prior to 
analysis. Analytical results for soil, sediment, and biological samples 
vary greatly depending on the digestion/extraction procedure used. Once 
the digestion/extraction/analytical procedures have been selected, the 
laboratory should establish the overall precision and accuracy and must 
follow the procedure closely, including the instrumental calibration, 
standardization, and QC requirements specified in each procedure.

Any modifications must be documented in detail. At a minimum, the 
laboratory should analyze one duplicate, one standard reference sample, 
and/or one spiked sample (addition of a known quantity of standard to the 
environmental sample) in every 10 samples or in a batch of fewer than 
10 samples.

7.2. Analytical Procedures for Air Samples

Use of EPA procedures for air analysis is recommended. Such 
procedures are presented in "Quality Assurance Handbook for Air Pollution 
Measurement Systems," Vol. 1, EPA-600/9-76-005, March 1976; Vol. 2, EPA-
600/4-77-027a, May 1977; Vol. 3, EPA-600/4-77-027b, Aug. 1977; and 
supplements.






7.3. Records and Documentation

Any modifications of procedures must be documented in detail in a 
logbook. The preparation of standards and reagents used for equipment 
calibration and method standardization should be recorded in a bound 
notebook. The name of the person preparing the standards and/or reagents 
and the date of preparation should also be recorded in the notebook and 
on the container.

7.4. Reference Materials for Chemical Analysis

1. Methods for Determination of Inorganic Substances in Water and 
Fluvial Sediments, Techniques of Water Resources Investigations, 
Book 5, U.S. Geological Survey, 1979.

2. Fishman, M. J., and Bradford, W. L., A Supplement to Methods 
for the Determination of Inorganic Substances in Water and 
Fluvial Sediments, U.S. Geological Survey Open-File 
Report 82-272, p. 136.

3. Methods for Chemical Analysis of Water and Wastes, 
EPA-600/4-79-020, U.S. Environmental Protection Agency, 
March 1983.

4. Test Methods, Methods for Organic Chemical Analysis of 
Municipal and Industrial Wastewater, EPA-600/4-82-057, U.S. 
Environmental Protection Agency, July 1982.

5. Guidelines Establishing Test Procedures for the Analysis of 
Pollutants Under the Clean Water Act; Final Rule and Interim 
Final Rule and Proposed Rule, Federal Register Vol. 49, No. 
209, October 26, 1984.

6. Standard Methods for the Examination of Water and Wastewater, 
15th or latest edition.

7. Methods Manuals of the Association of Official Analytical 
Chemists, 13th Edition, 1980.

8. ASTM Annual Book of Standards, Part 31, "Water."

9. Analysis of Pesticide Residues in Human and Environmental 
Samples, Health Effects Research Laboratory, Environmental 
Toxicology Division, EPA, 1977 rev. ed.

10. Taylor, J. K., Validation of Analytical Methods, Anal. Chem., 
Vol. 55, 600A-608A, 1983.

11. Test Methods - Methods for Nonconventional Pesticides Chemicals 
Analysis of Industrial and Municipal Wastewater, 
EPA 440/1-83/079C, Jan. 31, 1983.

12. Plumb, Russell H., Jr., Procedures for Handling and Chemical 
Analysis of Sediment and Water Samples, Environmental Lab., 
U.S. Army Engineer Waterways Experiment Station, 
Vicksburg, Mississippi.

7.5. Reference Materials for Biological Studies

1. Finney, D. J., Statistical Methods in Biological Assay, Hafner 
Publishing Co., New York, NY, 1964.

2. Green, R. H., Sampling Designs and Statistical Methods for 
Environmental Biologists, John Wiley and Sons, NY, 1979.

3. Sokal, R. R., and F. J. Rohlf, Biometry, W. H. Freeman and Co., 
San Francisco, CA, 1969.

4. Tetra Tech, Bioaccumulation Monitoring Guidance: Selection of 
Target Species and Analytical Detection Limits, prepared for the 
U.S. Environmental Protection Agency, Washington, DC, 1985.

5. State-of-the-Art Analytical Techniques for the Determination of 
Halogenated Organics and Trace Elements in Biological Tissues, 
U.S. Fish and Wildlife Service.

6. Procedures for Analysis of Selenium and Arsenic in Fish and 
Wildlife Tissues with Emphasis on Quality Control, Lab. Report 
No. 86-3, Analytical Chemistry Unit, Fish and Wildlife Water 
Pollution Control Lab., Department of Fish and Game, Oct. 1986.






8. DATA REDUCTION, VALIDATION, AND REPORTING

Each laboratory shall identify the individuals responsible for 
handling data, and should work with the project manager/QAO to determine 
the QA information to be transmitted with the data. Laboratories shall 
provide verified data sets to the Project/agency QA Officer with adequate 
laboratory QC data to allow independent data validation. Laboratories 
are responsible for reporting accuracy, and the Project/agency QA Officer 
is responsible for final data validation based upon the reported results 
of the blind field QC samples. Should the results of field QC samples be 
unacceptable, the laboratory analyses should be reviewed and proper 
corrective action must be taken. If data are found to be invalid, 
resampling should be requested.

8.1. Data Reduction

Data reduction includes those mathematical and/or statistical 
procedures used to convert raw environmental measurement(s) into the 
final use form. Quality control procedures should be carefully designed 
to eliminate errors during these processes. Calibration procedures 
should be described, to the extent possible, in analytical SOPs. SOPs 
should describe the review and cross-check procedures used in the data 
reduction process, and should also completely cover the stepwise 
procedures for entering data onto various forms and into the computer 
systems used for data reduction.

Laboratories calculate their results in accordance with the 
directions specified in the individual analytical methods. Data must be 
reduced to, and reported in, the units used in the data storage system, 
such as STORET.

8.2. Data Validation

Data validation should include checks of quality control data, 
checks for transmittal errors, verification of laboratory capability, 
etc.

The laboratory must verify all analytical results prior to 
reporting. Validation should be based on the QC data generated 
simultaneously with the batch of samples analyzed. Data may be assumed 
valid if the following criteria are met (a schematic check of these 
criteria is sketched below):

1. In each batch of samples, the spike recovery is within the 
upper and lower control levels; e.g., two standard deviations 
from the mean spike recovery. For soil, sediment, and 
biological samples, results of analyses by two different 
methods or from two laboratories are the same or equivalent.

2. In each batch of samples, the difference (range) between 
replicate analyses falls within two standard deviations of the 
mean value.

3. Results of all reference samples are acceptable, i.e., they 
meet the predetermined P and A.

If the QC data do not meet the above criteria, the samples in the batch 
shall be reanalyzed. If the reanalyzed results remain unacceptable, the 
data should be declared invalid and resampling should be conducted, if 
possible.
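
A schematic Python sketch of these checks is given below; the control 
limits and values are hypothetical stand-ins for limits that would come 
from a laboratory's own QC charts, not values prescribed by these 
guidelines.

    # Schematic batch validation against hypothetical QC-chart limits.
    def batch_valid(spike_recovery, recovery_mean, recovery_sd,
                    replicate_range, range_mean, range_sd,
                    references_ok):
        """Apply the three acceptance criteria listed above."""
        spike_ok = abs(spike_recovery - recovery_mean) <= 2 * recovery_sd
        range_ok = abs(replicate_range - range_mean) <= 2 * range_sd
        return spike_ok and range_ok and references_ok

    # Example: 96 percent spike recovery against a control window of
    # 100 +/- 2*5 percent, and a replicate range of 0.3 against a mean
    # range of 0.2 with standard deviation 0.1.
    print(batch_valid(96.0, 100.0, 5.0, 0.3, 0.2, 0.1, True))  # True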






The accuracy of soil, sediment, or biological analyses is more 
difficult to validate. One should be careful when comparing results of 
split samples analyzed by two or more laboratories, particularly for 
heterogeneous samples such as soil, sediment, or biota. Similar results 
from two laboratories using the same procedure or different procedures 
do not necessarily mean the data are accurate. A laboratory that analyzes 
reference samples accurately may not analyze environmental samples 
accurately.

8.3. Data Reporting

The project manager or agency QAO should provide reporting 
requirements to the laboratory.

The laboratory should provide the Project/agency QA Officer the 
verified data sets and other required information. These analytical 
reports shall include data on laboratory spike recoveries and the percent 
difference of duplicate analyses.

Analytical reports shall be signed by the laboratory director. 
Data submitted by contractors must be in units and in a format 
consistent with those of the agency data system.

Typical data report forms are presented in Appendix 1.

8.4. Data Assessment

Procedures and criteria used for assessing data validity and 
comparability with historical data should be developed by the Project 
Officer in conjunction with the QA Officer.

The Project/agency QA Officer is responsible for conducting checks 
on the analytical results of blind spiked and blind duplicate/replicate 
samples. If the precision and/or accuracy of blind samples are outside of 
established acceptance limits, one or more of the following actions 
should be taken: (1) review laboratory analyses and performance, 
(2) reanalyze the entire batch of samples, and (3) take other appropriate 
corrective action, such as resampling. The Project QA Officer is also 
responsible for checking data consistency with historical data and 
laboratory protocols, and finally, for a check of complete adherence to 
the quality control elements.

8.5. Data Storage

After validation, acceptable data should be transmitted to the data 
management staff for storage in the data system. The data management 
staff is responsible for data entry, verifying the reported units for 
consistency with those in the data base, verifying the data qualifiers 
being used and the level of detail needed, and proofreading data 
entries.

All data generated from these studies should be stored in a 
centralized data system, e.g., the USEPA STORET system, and should be 
readily available for the use of engineers, program managers, and 
scientists involved in drainage studies. Most agencies also store their 
data in their in-house data bases.

On a yearly basis, data management staff and the Agency QA Officer 
should conduct a review/audit of the data base.






8.6. Reference Materials

1. ASTM Manual on Presentation of Data and Control Chart Analysis, 
ASTM, Philadelphia, PA, 1976, STP 15D.






9. INTERNAL QUALITY CONTROL CHECKS

Internal quality control comprises the routine activities and 
checks, such as periodic calibrations, duplicate analyses, use of spiked 
samples, etc., included in normal internal procedures to control the 
accuracy and precision of a measurement process.

9.1. QC Checks for Sampling

Each measurement system shall have predetermined limits to 
identify when corrective action is required, before data become 
unacceptable.

Each study plan should address controls for accurate location of 
sampling stations, the rationale used in selecting the number and location 
of those stations, controls on sample collection and sample handling, 
and the types and frequency of QA/QC samples. The types and frequency 
of QA/QC samples commonly used are tabulated below:






Table 2. Level of Effort in QA Sampling

Data Characteristic        Sample Type                Minimal Effort (a)
Evaluated

Field/transfer             Field blank                One per 10 samples (one
contamination                                         per batch of samples
                                                      less than 10)

Sampling equipment         Field blank                Equipment rinse, as
contamination (for                                    appropriate
sediment or soil)

Accuracy (field            Spiked sample              One per 10 samples
variability)               (submitted as blind)

Accuracy                   Blind reference samples    One per 20 samples

Accuracy                   Split samples              20-50% of samples
                           (with other lab)

Precision (field and       Field duplicates/          One per 10 samples
lab variability)           replicates (submitted
                           as blind) (b)

(a) Where two alternatives are cited, the alternative resulting in more 
frequent analysis should be used.

(b) Field duplicates (aliquots from the same sample) measure 
contamination from the sample container and variations of the analytical 
process. Field replicates (consecutive sampling within 1 minute) measure 
variation induced by sampling procedures plus that measured by field 
duplicates.



9.2. Laboratory QC Checks

The laboratory QA plan/manual shall describe in detail the QC practices: 
the type and frequency of QC samples (laboratory spiked samples, duplicate 
analyses, reference samples, blanks, and standards) to be analyzed, 
statistical procedures for establishing QC limits, preparation of QC 
charts, and procedures for verification of data quality. The types and 
frequency of QA/QC analyses commonly used are tabulated below:






Table 3. Level of Effort in Laboratory QA

Data Characteristic        Sample Type                Minimal Effort (a)
Evaluated

Laboratory                 Reagent blank              One per batch analyzed,
contamination                                         less than 10 samples

Accuracy (lab              Spike (known addition)     One per 10 samples (one
variability)                                          per batch analyzed,
                                                      less than 10 samples)

                           Standard reference         One per 10 samples (one
                           materials                  per batch analyzed,
                                                      less than 10 samples)

(a) Where two alternatives are cited, the alternative resulting in more 
frequent analysis should be used.

9.3. Reference Materials for Laboratory Quality Control

1. Quality Assurance Practices for the Chemical and Biological 
Analyses of Water and Fluvial Sediments, Techniques of Water 
Resources Investigations of the United States Geological 
Survey, Book 5, Chapter A6, 1982.

2. Handbook for Analytical Quality Control in Water and Wastewater 
Laboratories, EPA 600/4-79-019, March 1979.

3. Manual of Analytical Quality Control for Pesticides and Related 
Compounds in Human and Environmental Samples. A Compendium of 
Systematic Procedures Designed to Assist in the Prevention 
and Control of Analytical Problems, EPA Health Effects Research 
Lab., Research Triangle Park, North Carolina, 1976.

4. Taylor, J. K., Quality Assurance of Chemical Measurements, 
Anal. Chem., Vol. 53, No. 14, 1589A-1596A, 1981.

5. Principles of Environmental Analysis, Committee on Environmental 
Improvement, American Chemical Society, 1983.

6. Guidelines for Data Acquisition and Data Quality Evaluation 
in Environmental Chemistry, Anal. Chem., Vol. 52, 2242-2249, 
1980.

7. ASTM Manual on Presentation of Data and Control Chart Analysis, 
ASTM, Philadelphia, PA, 1976, STP 15D.

8. Manual for Analytical Quality Control for Pesticides and Related 
Compounds in Human and Environmental Samples, EPA-600/1-79-008.

9. Calculation of Data Quality Indicators, EPA/EMSL, Research 
Triangle Park, NC, and Las Vegas, NV, 1983.

10. Environmental Measurement Method Performance Data for 
Establishing Achievable Data Quality Goals, EPA/EMSL, Research 
Triangle Park, NC, 1983.

11. Janzer, Victor J., The Use of Natural Waters as U.S. Geological 
Survey Reference Samples, ASTM Special Technical Publication 
867, pp. 319-333, 1985.






10. PERFORMANCE AND SYSTEM AUDITS

Performance and system audits are essential QA activities to ensure 
that laboratories have the capability to generate scientifically and 
statistically valid data. However, one should not assume that all 
environmental data from a laboratory which performs satisfactorily on 
audit samples are valid.

These audits are categorized into internal and external audits.

10.1. In-house Performance and System Audits

Internal performance and system audits of sampling operations 
should consist of onsite reviews of field activities, field quality 
assurance systems, and the equipment used for sampling, calibration, and 
measurements. These shall be conducted by experienced personnel from 
other groups or other internal offices. The audits should include review 
of QA/QC plan implementation and items such as documentation of field 
observations and deviations from sampling procedures, if any, 
completeness of field forms, field data storage and filing procedures, 
calibration records of field and laboratory equipment, sample custody 
sheets, etc.

The in-house laboratory and contracted laboratories shall 
participate, and perform satisfactorily, in all of the agency's 
performance (reference sample) studies. If any unacceptable result is 
generated, the laboratory must correct the problems that caused the 
unacceptable result before it can perform that analysis on environmental 
samples. In that case, the Project Officer may find a qualified 
laboratory to do the analysis during the interim period.






Internal system audits of laboratory operations will consist of 
review of the documentation of analytical and QC procedures, QC charts, 
records of performance studies, records of equipment calibration and 
services, and documentation of extraction and analytical processes.

NBS, EPA, and USGS standard reference samples and specially 
prepared reference samples may be used for performance audits or 
interlaboratory comparison studies. Procedures for and records of the 
preparation of the audit samples must be described in detail and a copy 
maintained by the QAO.

10.2. External Performance and System Audits

Laboratories should participate in reference sample and 
interlaboratory comparison studies sponsored by other agencies, whenever 
possible.

It is important that performance and QA system audits of field and 
laboratory operations be conducted by external audit groups, e.g., EPA, 
the National Academy of Sciences, etc.

Specific interlaboratory comparison studies should be initiated 
for the SJVDP. All laboratories generating data for the SJVDP should be 
mandated to participate. The sponsor laboratory should be responsible 
for oversight and evaluation of the interlaboratory comparison study. It 
is recommended that the SJVDP Program QA Officer be involved in the study 
evaluation.






11. SPECIFIC ROUTINE PROCEDURES USED TO ASSESS DATA 
PRECISION, ACCURACY, AND COMPLETENESS

The assessment of data precision, accuracy, completeness, and blanks 
will be made by the Project Officer or QAO.

11.1. Assessment of Precision

The mean, $\bar{C}$, of a series of replicate measurements of 
concentration, $C_i$, is calculated as:

$$\bar{C} = \frac{1}{n} \sum_{i=1}^{n} C_i$$

where n = number of replicate measurements; $\bar{C}$ and $C_i$ are both 
in mg/L or mg/kg.

The estimate of precision of duplicate measurements $C_1$ and $C_2$ is 
expressed as the relative percent difference (RPD):

$$\mathrm{RPD} = \frac{|C_1 - C_2|}{(C_1 + C_2)/2} \times 100$$

The calculated RPD will be compared with the respective goals identified 
in the Quality Assurance Program/Project Plan.

The estimate of precision of a series of replicate measurements 
(primarily used in GC/MS analysis) is expressed as the relative standard 
deviation (RSD):

$$\mathrm{RSD} = \frac{100}{\bar{C}} \sqrt{\frac{\sum_{i=1}^{n} (C_i - \bar{C})^2}{n - 1}}$$




11.2. Assessment of Accuracy

Accuracy is monitored by analyzing standards, samples that have 
been spiked with standard solutions, and control samples. Approximately 
10 percent of all samples are analyzed as spiked samples. The accuracy 
of new analytical methods is verified by analyzing replicate spiked 
samples intermixed with routine samples.

Accuracy is evaluated by comparing the mean recovery of spiked 
analytes against the predetermined accuracy objectives. The recovery 
of a spiked analyte is defined as:

$$\mathrm{Recovery,\ \%} = \frac{\text{total analyte found} - \text{analyte originally present}}{\text{analyte added}} \times 100$$


11.3. Assessment of Current Data Based on Historical Data

Current data should be further assessed using statistical 
variations of historical data and significance tests, e.g., Student's 
t-test, etc. Basic statistics of the historical data for an individual 
sampling site, with n data points, should be computed:

Sample mean:

$$\bar{X} = \frac{X_1 + X_2 + \cdots + X_n}{n}$$

Sample standard deviation:

$$s = \sqrt{\frac{\sum_{i=1}^{n} (X_i - \bar{X})^2}{n - 1}}$$

The 95 percent confidence limits ($\bar{X} \pm 1.96s$) of historical data 
are frequently used to assess data acceptance. Data outside of these 
limits are considered questionable. The Project Officer and/or QAO should 
consult a statistician to develop the assessment criteria for the SJVDP's 
data. If data are found to be questionable, this should be so indicated 
in the data base.
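
A minimal sketch (with hypothetical site data) of flagging current 
results that fall outside the historical 95 percent confidence limits:

    # Sketch: flag current results outside historical 95 percent limits.
    import statistics

    historical = [5.1, 4.8, 5.4, 5.0, 4.9, 5.2, 5.3, 4.7, 5.0, 5.1]
    xbar = statistics.mean(historical)
    s = statistics.stdev(historical)
    lo, hi = xbar - 1.96 * s, xbar + 1.96 * s

    for value in (5.2, 6.3):
        status = "acceptable" if lo <= value <= hi else "questionable"
        print(f"{value}: {status}")  # 5.2 acceptable, 6.3 questionable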






12. CORRECTIVE ACTION

Each agency or laboratory shall define and document the criteria 
under which corrective actions will be taken: failure in performance 
audits/studies, deficiencies revealed during a system audit, analytical 
results falling outside of acceptance limits, etc.

12.1. Categories of Corrective Actions

1. Handling of sample analysis, data processing, or equipment 
failures.

2. Handling of nonconformance or noncompliance with the QA 
objectives for measurement data that have been set forth (see 
section 3).

12.2. Person Responsible for Initiating Corrective Actions

12.2.1. When a result is deemed unacceptable by the laboratory QC 
Officer or laboratory director based on information obtained 
from a laboratory spike, duplicate/replicate analyses, 
blind reference samples, or a control chart, the laboratory 
director or QC Officer should request reanalysis of the 
sample set.

12.2.2. When results of field QC samples are deemed unacceptable, 
the Project/agency QA Officer should request that the 
laboratory reanalyze the samples, or request resampling.

12.2.3. When an analytical result in a performance audit/study is 
unacceptable, the laboratory director or the agency QA 
Officer will suspend such analysis until the problem has 
been identified and corrective action implemented.

12.2.4. When a system audit reveals an unacceptable performance, 
the laboratory director or the agency QA Officer should 
request that work be suspended until the problem has been 
identified and corrective action implemented.

12.2.5. When an error in data processing (such as data coding, 
data entry, etc.) is found, the manager/supervisor of 
data management should initiate the corrective action.





13. QA REQUIREMENTS FOR PROCUREMENT OF SAMPLING 
AND ANALYSIS

When an agency procures any data generating activities, QA 
requirements for the procurement shall be established. The requirements 
should encompass the QA of the procurement process and the QA 
requirements of the contractor.

13.1. QA of Procurement Process

The contract procurement process begins with conception of the 
work to be performed. This conception is translated into a solicitation 
via a procurement request rationale document. QA requirements must be 
clearly specified in the "scope of work" section of the procurement 
request rationale document and must be given a proper weighting in the 
evaluation criteria. QA requirements shall include: submission of the QA 
plan/manual for review, performance on reference samples, and a preaward 
site evaluation. The QA requirements delineated in the procurement 
request rationale document should be described in the Request for 
Proposal (RFP) and will eventually become a part of the contract. 
Examples of QA requirements for a contract laboratory and QA review forms 
are provided in section 13.3. Technical evaluation of all proposals 
should be conducted by the Project Officer and QAO, or better, by a 
technical evaluation panel. Qualified offerors are identified on the 
basis of the technical review.

Written or oral discussions should be conducted with all responsive 
offerors who have submitted proposals within the competitive range. 
During these discussions, any uncertainties should be resolved concerning





the offeror's compliance with the specified QA aspects of the effort.
Following the receipt of "best and final" offers, it is of value to
conduct a preaward site evaluation of the offerors in the final
competitive range. The preaward evaluation provides an ideal opportunity
for the Project Officer to gain an in-depth knowledge of the QA programs
of various offerors. The contract should be awarded based primarily on the
technical evaluation and secondarily on the business/financial evaluation.

13.2. QA Requirements of Contractors 

The contractor must have a comprehensive QA plan/manual that is approved by
the contracting agency. The contractor's QA plan must contain all
appropriate QA elements described in this guidelines document and shall be
strictly followed. The contract laboratory should be required to analyze
performance evaluation samples periodically and to participate in
performance audits at the laboratory's expense. Results of performance
evaluation samples should be reported within the specified time. The QAO
should arrange for delivery of the performance samples and evaluate the
results.

The contract laboratory must comply with the QA requirements stated in the
contract in addition to its internal QA/QC provisions.

13.3. Examples of QA Requirements and QA Review Form






QUALITY ASSURANCE REQUIREMENTS 

1. The contractor shall submit for review, prior to the award of contract,
   a copy of its Quality Assurance Program, which will include, but not be
   limited to, the following:

   a. Written standard operating procedures (SOPs) designed to control the
      flow of samples through the contracting laboratory, e.g., procedures
      detailing receipt of samples and tracking of samples through
      preparation, analysis, and reporting of data.

   b. A written procedure for quality control practices on instruments,
      equipment, reagents, supplies, and analyses to assure that the data
      generated are of acceptable precision and accuracy.

2. The contractor shall routinely challenge its analytical system with
   duplicate and spike analyses (one of every 10 samples), internal
   standards, and unknown check samples.

3. All calibration, tuning, and quality control requirements included
   for a given analytical method shall be strictly followed.

4. The contractor shall participate in performance evaluation studies
   designated by the Project Officer for parameters covered by the
   contract. Performance evaluation samples, split samples, and blind
   samples may be supplied to the contractor by the Project Officer. The
   contractor shall follow the instructions provided with the samples and
   analyze them quantitatively.

5. The contractor shall operate its own internal quality control program
   for an overall measure of performance. As samples are analyzed over a
   period of time, the Project Officer will monitor the data and review
   all problems with the contractor. The contractor shall be responsible,
   at its own expense, for correcting all discrepancies. The Project
   Officer will furnish the contractor with additional samples in order to
   verify that any identified problem has been resolved. These additional
   samples shall be analyzed by the contractor at its own expense.
   Analytical results reported shall include quality control data produced
   by the contractor to ensure that the results are of acceptable quality.

The contractor's laboratory may be evaluated by an on-site visit prior to
the award of contract.

For additional information concerning quality assurance requirements,
contact the Project Officer.






QUALITY ASSURANCE REVIEW FOR CONTRACTED LABORATORY (CONTRACTS)



I. GENERAL INFORMATION
   Descriptive Title:
   Sponsoring Program Office:
   Approximate Dollar Amount:
   Duration:

II. THIS CONTRACT REQUIRES ENVIRONMENTAL MEASUREMENTS              Yes   No
    (If yes, complete form; if no, sign form and submit with
    procurement request)

III. QUALITY ASSURANCE REQUIREMENTS
     (Projects requiring environmental measurements)               Yes   No

     Submission of a written quality assurance (QA) program plan
     (commitment of the offeror's management to meet the QA
     requirements of the scope of work) is to be included in the
     contract proposal.                                            ___   ___

     Submission of a written QA project plan is to be included
     in the contract proposal.                                     ___   ___

     A written QA project plan is required as a part of the
     contract.                                                     ___   ___

     Performance on available audit samples or devices shall be
     required as part of the evaluation criteria (see list on
     reverse side).                                                ___   ___

     An on-site evaluation of the proposer's facilities will be
     made to ensure that a QA system is operational and exhibits
     the capability for successful completion of this project
     (see schedule on reverse side).                               ___   ___

     QA reports will be required (see schedule on reverse side).   ___   ___

IV. DETERMINATION (Projects involving environmental measurements)
    Percentage of technical evaluation points assigned to QA.      ___
    Project Officer estimate of percentage of cost allocated to
    environmental measurements.                                    ___

QA Form QAR-C, Revision No. 1, 1981






              QC Reference         Split Samples for      Required
Parameter     Sample or Device     Cross-Comparison       for Preaward    Frequency
Measured      Available            (Yes or No)            (Yes or No)     During Contract
              (Yes or No)

QA System Audits are required: Preaward ___; during Contract ___

QA Reports are required: With Progress Reports ___; with Final Report ___;
with Analytical Reports ___.

The signatures below verify that the QA requirements have been established.

QA Officer:

Signature                                              Date

Signature                                              Date

After signatures, a copy of this form must be included with the Request for
Proposal and sent to the Contracts Office, and a copy placed on file with
the QA Officer.

QA Form QAR-C






14. TRAINING PROGRAM 



An important component of a QA program is the training of new staff and the
maintenance of the competency level of existing personnel. Staff should be
familiarized with the QA program and related activities. Training can
involve both internal and external activities. A training program should be
developed and implemented in close coordination with management and each
individual staff member.






15. QUALITY ASSURANCE REPORTS TO MANAGEMENT

Quality Assurance Reports to the Project Officer and the Program Manager
shall be made periodically by the QA Officer. QA reporting should be tied
to the completion of the various elements of the QA Program Plan rather
than to general time periods. The reports should contain information
regarding data accuracy, precision, and completeness, and the results of
system and performance audits, along with any reports of corrective action,
sample alteration, or other significant QA problems. Through this
mechanism, effective solutions to project problems may be developed and
applied.






16 . SAFETY 

Each field operations office and laboratory should prepare a safety
procedures plan covering both the prevention of illness or injury and the
actions to be taken in the case of injury or illness while conducting field
or laboratory work.






17. QA IMPLEMENTATION SCHEDULE 

Every agency shall prepare a yearly QA implementation schedule. The
following is an example of a QA implementation schedule.

17.1. USBR FY 87 QA Implementation Schedule

Activity Items                                              Projection Date

1.  Complete SOPs for all sampling
2.  Complete chain-of-custody procedure
3.  Complete drainage water reference sample study
4.  Initiate soil reference sample study
5.  Initiate vegetation reference sample study
6.  Field audit
7.  Initiate blood/urine reference sample study
8.  Complete lab QA Project Plan revision
9.  Complete all QC charts
10. Prepare all laboratory protocols
11. Complete data validation procedures preparation
12. Prepare data reporting procedure
13. Complete development of graphite furnace capability
14. Complete development of capability for pesticide analysis
15. External lab audit (USGS)
16. Establish data acceptance criteria (documented)
17. Establish data assessment procedure (written)
18. Begin streamlined data management system






APPENDIX 1 



TYPICAL DATA REPORT FORMS 






FORM 1

ANALYSIS DATA PACKAGE

Date Submitted                    Date Received
Batch No.                         QC Report No.
Laboratory Name
Contract No.

Field ID    Lab ID                Field ID    Lab ID

Footnotes and General Comments:



FORM 2

USBR SAMPLE NO.
ANALYSIS DATA SHEET

LAB NAME:
BATCH NO.
LAB SAMPLE ID NO.
QC REPORT NO.
DATA RELEASE AUTHORIZED BY:

ANALYTICAL VALUES

Constituent 1/                  Concentration Value        Method No.

 1. Acidity
 2. Alkalinity
 3. Aluminum
 4. Antimony
 5. Arsenic
 6. Barium
 7. Beryllium
 8. Boron
 9. Bromide
10. Cadmium
11. Calcium
12. Chloride
13. Chromium
14. Cobalt
15. COD
16. Copper
17. Cyanide
18. DOC

ANALYSIS DATA SHEET (Form 2 cont'd)

Constituent 1/                  Concentration Value        Method No.

19. Fluoride
20. Iodide
21. Iron
22. Lead
23. Lithium
24. Magnesium
25. Manganese
26. Mercury
27. Molybdenum
28. Nickel
29. pH
30. Potassium
31. Selenium
32. Silica
33. Silver
34. Sodium
35. Solids, 180°C
36. Specific Conductance
37. Strontium
38. Sulfate
39. Turbidity
40. Vanadium
41. Zinc

Cation-Anion Difference = _____ %

Comments:

1/ Only those constituents requested will be included.
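The "Cation-Anion Difference %" entry can be checked computationally. A
minimal sketch in Python, assuming the standard charge-balance convention
(difference as a percentage of total equivalents, with mg/L converted to
meq/L first); the form itself does not define the formula, and the ionic
concentrations shown are illustrative:

    # Minimal sketch: percent cation-anion difference for a water analysis.
    # Assumes the standard charge-balance formula (not given on the form):
    #   %diff = (sum cations - sum anions) / (sum cations + sum anions) * 100
    # with concentrations converted from mg/L to meq/L. Equivalent weights
    # are standard values; the sample concentrations are made up.

    EQUIV_WEIGHT = {            # g/equivalent
        "Ca": 20.04, "Mg": 12.15, "Na": 22.99, "K": 39.10,
        "Cl": 35.45, "SO4": 48.03, "HCO3": 61.02,
    }

    cations_mg_l = {"Ca": 120.0, "Mg": 45.0, "Na": 310.0, "K": 6.0}
    anions_mg_l = {"Cl": 180.0, "SO4": 640.0, "HCO3": 200.0}

    def meq_per_l(conc):
        # Total equivalents per liter for a set of ions.
        return sum(v / EQUIV_WEIGHT[ion] for ion, v in conc.items())

    cat, an = meq_per_l(cations_mg_l), meq_per_l(anions_mg_l)
    pct_diff = (cat - an) / (cat + an) * 100.0
    print(f"cations {cat:.2f} meq/L, anions {an:.2f} meq/L, "
          f"difference {pct_diff:+.1f}%")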













FORM 3

[Rotated form, illegible in this copy; the legible fragments "NUMBER OF"
and "CONTAINERS" indicate a sample submittal/tracking form.]
FORM 4

LAB NAME                                        Q.C. REPORT NO.
DATE

QUARTERLY INSTRUMENT DETECTION LIMITS
BATCH NO.

                     Required             Instrument Detection Limits (IDL) 2/
Constituent          Detection            ------------------------------------------
                     Limits (CRDL) 1/     IDL    Method No.    IDL    Alternate Method No.

 1. Aluminum            10*
 2. Antimony             1*
 3. Arsenic              1*
 4. Barium             100*
 5. Beryllium           10*
 6. Boron               20*
 7. Bromide              .1
 8. Cadmium              1*
 9. Calcium              .1
10. Chloride             .2
11. Chromium             1*
12. Cobalt               3*
13. COD                 10
14. Copper               1*
15. Cyanide              .01
16. DOC                  .1
17. Fluoride             .1
18. Iodide               .01
19. Iron                10*
20. Lead                 1*
21. Lithium             10*
22. Magnesium            .1
23. Manganese           10*
24. Mercury              .1*
25. Molybdenum
26. Nickel               1*
27. Potassium            .1
28. Selenium
29. Silica               .1
30. Silver
31. Sodium               .2
32. Strontium           10*
33. Sulfate              .2
34. Vanadium             1*
35. Zinc                10*

1/ Unlabelled CRDL are in mg/L units; * denotes ug/L.
2/ Complete only if more than one method is used for a specified
   determination.
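For laboratories filling in the IDL columns, one common convention
(assumed here; Form 4 itself does not prescribe the computation) estimates
the instrument detection limit as three times the standard deviation of
replicate readings of a blank or low-level standard. A minimal sketch in
Python with illustrative readings:

    # Minimal sketch: estimate an instrument detection limit (IDL) from
    # replicate instrument readings of a blank or low-level standard.
    # IDL = 3 x standard deviation is a common convention, assumed here.
    import statistics

    replicate_readings_ug_l = [0.21, 0.35, 0.18, 0.27, 0.31, 0.24, 0.29]
    idl = 3 * statistics.stdev(replicate_readings_ug_l)
    print(f"estimated IDL = {idl:.2f} ug/L")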



FORM 5

LAB NAME                                        Q.C. REPORT NO.
DATE

INITIAL AND CONTINUING CALIBRATION VERIFICATION
BATCH NO.

                    Control     Initial Calib. 2/          Continuing Calibration 1/, 3/
Constituent         Limits %    True Value  Found  %R      True Value  Found  %R  Found  %R  Method

 1. Acidity          85-115
 2. Alkalinity       93-107
 3. Aluminum         90-110
 4. Antimony         90-110
 5. Arsenic          90-110
 6. Barium           90-110
 7. Beryllium        90-110
 8. Boron            85-115
 9. Bromide          90-110
10. Cadmium          90-110
11. Calcium          90-110
12. Chloride         93-107
13. Chromium         90-110
14. Cobalt           80-120
15. COD              80-120
16. Copper           90-110
17. Cyanide          85-115
18. DOC              80-120
19. Fluoride         90-110
20. Iodide           80-120
21. Iron             90-110
22. Lead             80-120
23. Lithium          90-110
24. Magnesium        90-110
25. Manganese        90-110
26. Mercury          80-120
27. Molybdenum       90-110
28. Nickel           90-110
29. pH                0.24
30. Potassium        90-110
31. Selenium         90-110
32. Silica           90-110
33. Silver           90-110
34. Sodium           93-107
35. Solids, 180°C    93-107
36. Sp. Cond.        93-107
37. Strontium        90-110
38. Sulfate          90-110
39. Turbidity        90-110
40. Vanadium         90-110
41. Zinc             90-110
42. Tin              90-110

1/ Includes results from both lower and upper halves of the calibration
   range. %R = (Found/True Value) x 100.
2/ Initial calibration source.
3/ Continuing calibration source.
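The %R entries and control-limit checks on Form 5 are straightforward to
automate. A minimal sketch in Python using control limits from the table
above; the true and found values are illustrative:

    # Minimal sketch: calibration verification percent recovery,
    # %R = (Found / True Value) x 100, checked against Form 5 limits.
    # The measurement values below are illustrative.
    CONTROL_LIMITS = {"Aluminum": (90, 110), "Boron": (85, 115), "Lead": (80, 120)}

    checks = [("Aluminum", 50.0, 47.1), ("Boron", 100.0, 118.0), ("Lead", 10.0, 9.1)]
    for constituent, true_value, found in checks:
        pct_r = found / true_value * 100.0
        lo, hi = CONTROL_LIMITS[constituent]
        status = "ok" if lo <= pct_r <= hi else "OUT OF CONTROL"
        print(f"{constituent:9s} %R = {pct_r:6.1f}  ({lo}-{hi})  {status}")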





Form 6

Lab Name                                        QC Report No.
Date

BLANKS

Batch No.
Field ID
Lab ID

Constituent : Calibration Blank Conc. : Preparation Blank Conc.



FORM 7 (QUARTERLY)

LABORATORY NAME
DATE

ICP LINEAR RANGES
ICP MODEL NUMBER

Analyte              True Concentration    Measured Concentration    Percent Difference

 1. Barium
 2. Beryllium
 3. Cadmium
 4. Calcium
 5. Cobalt
 6. Copper
 7. Iron
 8. Lead
 9. Lithium
10. Magnesium
11. Manganese
12. Molybdenum
13. Silica
14. Sodium
15. Strontium
16. Vanadium
17. Zinc

Comments

Lab Manager



FORM 8

QUARTERLY VERIFICATION OF ICP INTERELEMENT CORRECTION FACTORS

LAB NAME:
DATE

Compound             Wavelength    True 1/    Initial 2/        Final 2,3/
                                   Value      Observed    %D    Observed    %D

 1. Barium
 2. Beryllium
 3. Cadmium
 4. Calcium
 5. Cobalt
 6. Copper
 7. Iron
 8. Lead
 9. Lithium
10. Magnesium
11. Manganese
12. Molybdenum
13. Silica
14. Sodium
15. Strontium
16. Vanadium
17. Zinc

Comments:

1/ True value of EPA ICP Interference Check Sample or contractor standard.
2/ Refers to an initial concentration reading and a final concentration
   reading if changes were necessary in interelement correction factors.
3/ List significant changes, if any, in interelement correction factors.



FORM 9

Q.C. REPORT NO.

DUPLICATES

[Form body illegible in this copy; the recoverable footnotes follow.]

Listed control limits (RPD) apply to values greater than 20 CRDL.

* and NC indicate out-of-control and non-calculable RPD, respectively,
due to value(s) less than CRDL.

RPD = $|S - D| / ((S + D)/2) \times 100$
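A minimal sketch in Python of the RPD computation and the NC
(non-calculable) convention from the footnotes above; the
sample/duplicate pairs and the CRDL value are illustrative:

    # Minimal sketch: relative percent difference for duplicate analyses,
    # RPD = |S - D| / ((S + D) / 2) x 100, reported as "NC" when either
    # value is below the required detection limit (CRDL).
    def rpd(sample, duplicate, crdl):
        if sample < crdl or duplicate < crdl:
            return "NC"                       # non-calculable
        return abs(sample - duplicate) / ((sample + duplicate) / 2.0) * 100.0

    for s, d in [(10.0, 11.0), (0.4, 0.6)]:   # illustrative pairs
        result = rpd(s, d, crdl=1.0)
        label = result if isinstance(result, str) else f"{result:.1f} %"
        print(f"S={s}, D={d} -> RPD = {label}")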



Form 10

QC Report No.

SPIKES

Laboratory Name
Date

Batch No.
Field ID
Lab ID

Constituent : Amt. Analyte Added : Sample Conc. : Percent Recovery

Aluminum
Antimony
Arsenic
Barium
Beryllium
Bromide
Cadmium
Calcium
Chloride
Chromium
Cobalt
Copper
Cyanide
Fluoride
Iodide
Iron
Lead
Lithium

Form 10 (cont'd)
SPIKES

Constituent : Amt. Analyte Added : Sample Conc. : Percent Recovery

Magnesium
Manganese
Mercury
Molybdenum
Nickel
Potassium
Selenium
Silica
Strontium
Sulfate
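Percent recovery for spikes is conventionally computed from the
spiked-sample result, the unspiked sample concentration, and the amount of
analyte added. A minimal sketch in Python assuming that standard convention
(Form 10 lists the inputs but does not spell out the formula); the values
are illustrative:

    # Minimal sketch: spike percent recovery, using the common convention
    #   %R = (spiked result - sample conc.) / amount added x 100
    # (assumed here; not defined on the form itself).
    def spike_recovery(spiked_result, sample_conc, amount_added):
        return (spiked_result - sample_conc) / amount_added * 100.0

    print(f"{spike_recovery(14.8, 5.2, 10.0):.1f} %")   # -> 96.0 %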



APPENDIX 2

PRINCIPLES OF ENVIRONMENTAL ANALYSIS



Anal. Chem. 1983, 55, 2210-2218

Principles of Environmental Analysis*

AUTHORS

Lawrence H. Keith, Chairman, Radian Corporation
Warren Crummett, Dow Chemical Company
John Deegan, Jr., University of Illinois at Chicago
Robert A. Libby, The Procter & Gamble Company
John K. Taylor, National Bureau of Standards
George Wentler, The Procter & Gamble Company

ACS COMMITTEE ON ENVIRONMENTAL IMPROVEMENT

Members

Nina I. McClelland, Chairman, National Sanitation Foundation
Robert A. Baker, U.S. Geological Survey
William Beranek, Jr., Indianapolis Center for Advanced Research, Inc.
Joan Berkowitz, Arthur D. Little, Inc.
Geraldine V. Cox, Chemical Manufacturers Association
William H. Glaze, University of Texas at Dallas
Glen E. Gordon, University of Maryland
Ronald Hites, Indiana University
Lawrence H. Keith (a), Radian Corporation
Robert Libby (a), The Procter & Gamble Company
Robert Pojasek, Weston Consultants
Ruth Reck, General Motors Research Laboratories
Glenn Schweitzer, Environmental Protection Agency

Consultants

Frances L. Estes, Consultant for Environmental Sciences
James P. Lodge, Consultant in Atmospheric Chemistry

Associates

Donald F. Adams (a), Air Pollution Control Association
Joseph A. Cotruvo, Environmental Protection Agency
James P. Minyard, Jr., Mississippi State Chemical Laboratory
John Teasley (a), Environmental Protection Agency (retired)
George E. Wentler (a), The Procter & Gamble Company

Subcommittee Members Not on Committee

Michael M. Reddy (a), U.S. Geological Survey
John K. Taylor (a), National Bureau of Standards

Committee Staff Liaison

Susan Moses, Department of Public Affairs, American Chemical Society

* This is a revision of the "Guidelines for Data Acquisition and Data
  Quality Evaluation in Environmental Chemistry", which appeared in Anal.
  Chem. 1980, 52, 2242-2249.

(a) Member, CEI Subcommittee on Environmental Monitoring and Analysis.

PREFACE

Reliable analytical measurements of environmental samples are an essential
ingredient of sound decisions involving many facets of society, including
safeguarding the public health, improving the quality of the environment,
and facilitating advances in technology. In September 1973, the American
Chemical Society's Committee on Environmental Improvement (CEI) directed
its Subcommittee on Environmental Analytical Chemistry to develop a set of
guidelines that would help improve the overall quality of environmental
analytical measurements. In 1980, CEI published "Guidelines for Data
Acquisition and Data Quality Evaluation in Environmental Chemistry" (1).

In 1982 CEI decided to revise and update the 1980 publication and charged
the restructured subcommittee, now named the Subcommittee on Environmental
Monitoring and Analysis, with this task.

This new publication provides principles useful for many diverse
applications. Its intent is to aid in the evaluation of the many options
available in designing and conducting analytical measurements of
environmental samples and in the intelligent choice of those that will
meet the requirements of the situation at hand. These situations range
from semiquantitative screening analyses to those involving strict quality
assurance programs intended to document the accuracy of data for
regulatory enforcement or legal purposes. In this respect, these
principles are a set of guidelines for making decisions: decisions for
planning and executing analytical work or, conversely, for evaluating the
usefulness of previously generated analytical measurements for a current
need.

INTRODUCTION

The Principles of Environmental Analysis presented here are organized
according to a general operational model for conducting analytical
measurements of environmental samples. This document identifies elements
needed to obtain reliable data as well as factors that have been shown to
produce unreliable measurements. The principles, which appear in boldface
type, are intended to encompass broadly the needs for both organic and
inorganic measurements, although specific requirements for these two
general categories differ widely. Thus, these principles are not a
"recipe" for conducting specific determinations. They are, however,
intended to provide guidance in identifying important elements in an
analytical protocol that are necessary to meet the requirements of a
specific need.

Many options are available when analyzing environmental samples. It must
be recognized at the outset that differing degrees of reliability,
dictated by the objectives, time, and resources available, influence the
protocol chosen to meet the requirements of the problem at hand.

Analytical objectives for environmental samples differ from those for many
other types of samples because reliable measurements at very low levels
are frequently required. Often, specific analytes need to be measured at
the parts-per-billion and even parts-per-trillion levels in complex
matrices. Advances in analytical methodology continue to lower the levels
at which reliable measurements can be made and, conversely, requirements
for lower measurement levels provide an incentive for future analytical
advances. At these levels, many factors that are of little or no concern
in other analytical measurements are of critical importance in influencing
the outcome and reliability of environmental analyses.

PLANNING

Good planning is an essential principle of environmental analysis.
Inadequate planning will often lead to biased, meaningless, or unreliable
results; good planning, on the other hand, can produce valid results. The
intended use of the data should be addressed explicitly in the planning
process. Intended results are those that answer a question or provide a
basis on which a decision (for example, to adjust a process or to take
regulatory action) can be made. The objective of planning is to define the
problem and analytical program well enough that the intended results can
be achieved efficiently and reliably.

    It cannot be assumed that the person requesting an analysis will also
    be able to define the objectives of the analysis properly. Numerous
    discussions between the analyst and those who will use the results may
    be necessary until there is agreement on what is required of the
    analysis, how the results will be used, and what the expected results
    may be. The analytical methodology must meet realistic expectations
    regarding sensitivity, accuracy, reliability, precision, interferences,
    matrix effects, limitations, cost, and the time required for the
    analysis.

One of the most important elements in planning is to incorporate essential
decision criteria into the overall analytical protocol. The protocol,
which describes the analytical process in detail, should include the
objective, a description of the quality assurance and quality control
requirements, the sampling plan, analytical method(s), calculations, and
documentation and report requirements.

[Figure 1. Qualitative relationship of confidence, concentration, and
quality assurance effort (axes: analyte concentration, low to high;
quality assurance effort, low to high; curves for high, medium, and low
confidence). The quality assurance effort increases at a faster rate as
the need for higher confidence increases and the analyte concentration
decreases. Eventually a point is reached where the quality assurance
effort increases rapidly as a lower analyte concentration is approached
and also where further quality assurance effort produces diminishing
returns with increasing confidence.]

Selection of the optimum analytical method is one of the most important
factors influencing the reliability of resulting data (2). In addition to
the obvious limitations of availability of equipment, amount of sample,
time, and resources, other factors significantly affect the cost and
reliability of the data obtained. These are decisions pertaining to:

(1) The level of confidence required regarding the analyte's identity. For
example, is a low or a high level of confidence for correct identification
required? The latter may be achieved at higher expense by confirmatory
analysis with an independent measurement technique. A lower confidence
level can be achieved at less expense by comparing the analyte's spectral,
chromatographic, or other physical/chemical properties with values
reported in the literature.

(2) The analyte levels (both qualitative and quantitative) that need to be
measured. This decision often defines method selection, the amount of a
sample to be taken, the degree and type of sample pretreatment (e.g.,
cleanup and concentration) to employ, and the method of analysis.
Generally, the lower the levels of measurement that are required, the
higher the cost of doing the analyses.

(3) The degree of confidence needed (Figure 1). This decision will
influence method selection and the number of samples taken as well as the
design of the quality assurance program. In general, the higher the degree
of precision and accuracy needed, the more rigorous the quality assurance
program must be and the higher the costs of analysis.

(4) The degree of method validation that is necessary. On the basis of the
specifications developed in the first three items, the method must now be
examined to determine whether it actually can produce the degree of
specificity, precision, and "accuracy" required. If it does not, then
either the method must be improved or another method must be chosen. The
first stage of method validation ordinarily will involve only
intralaboratory validation. If more than one laboratory will be involved
in a measurement program (e.g., multilaboratory monitoring and
surveillance or regulatory actions), interlaboratory validation may be
required.

(5) The degree of quality assurance that is necessary. The use of
validated methods, reference laboratories, and experienced personnel still
does not assure the production of reliable analytical results. All
analytical work must be monitored






[Figure 2. Technical decision framework (flowchart): analyte isolation,
concentration, fractionation, derivatization, separation, identification,
quantitation.]

by a system of quality assurance to verify that the results obtained have
a high probability of being correct. The use of rugged methods,
experienced personnel, and a history of low outlier production will
justify a lesser degree of control than will the use of complex new
procedures and inexperienced personnel. However, the most important factor
determining the level of quality control is the consequences of being
wrong.

If an analytical result is to be used in a screening program or to adjust
a process parameter, an unvalidated analytical method may be sufficient
and appropriate. On the other hand, if regulatory compliance is the reason
for an analysis, a validated analytical method (especially one that has
been upheld in the courts) is usually required. In addition to making
decisions in a cost-effectiveness context, a separate set of decisions
must be made within a technical decision framework (Figure 2) (2).

In selecting an analytical method, consideration also must be given to
verifying that the method actually measures the analytes in question. A
second analytical method is often used to confirm that the intended
analyte is being measured by the first analytical method. For example, gas
chromatography/mass spectrometry (GC/MS) is often used to confirm a gas
chromatographic (GC) measurement.

QUALITY ASSURANCE AND QUALITY CONTROL

A quality assurance program is an essential part of a sound analytical
protocol and should be used by individuals as well as by laboratory
organizations to detect and correct problems in the measurement process or
to demonstrate attainment of a state of statistical control. The objective
of quality assurance programs for analytical measurements is to reduce
measurement errors to agreed-upon limits and to assure that the results
have a high probability of being of acceptable quality.

Two concepts are involved in quality assurance: quality control, the
mechanism established to control errors; and quality assessment, the
system used to verify that the analytical process is operating within
acceptable limits. Quality assurance of chemical measurements is reviewed
in some detail in ref 3. General handbooks that discuss quality assurance
are given in ref 4-6.

Each laboratory should have and use a quality assurance program. In
addition, every monitoring program should contain an appropriate quality
assurance plan and require that it be followed strictly by all
participants. In each case, the quality assurance program and plan should
be developed as a joint effort by all involved personnel, including
statisticians as required.

The elements of a quality control program include the following:
development of and strict adherence to principles of good laboratory
practice; consistent use of standard operating procedures; and
establishment of and adherence to carefully designed protocols for
specific measurement programs. The consistent use of qualified personnel,
reliable and well-maintained equipment, appropriate calibrations and
standards, and the close supervision of all operations by
management/senior personnel are essential components of a sound quality
control system. When properly conceived and executed, a quality control
program will result in a measurement system operating in a state of
statistical control, which means that errors have been reduced to
acceptable levels and have been characterized statistically.

Quality assessment describes those techniques used to assess the quality
of the measurement process and the results. The establishment of a system
of control charts is a basic principle. Control charts are plots of
multiple data points from the same or similar samples or processes vs.
time (9). They are used to determine if a system is in a state of
statistical control. Control charts should be used to visualize or monitor
the relative variability of repetitive data. Control charts also can be
used with reference materials, spiked samples, and analysis of surrogates
as a means of assessing the accuracy of measurements.

The attainment of statistical control is the first requirement that must
be met before assessment of accuracy can be made. For this reason, and
also to monitor ongoing precision, the development and maintenance of
appropriate control charts are essential features of the quality assurance
process. Control charts should be maintained and used in a real-time mode
to the extent possible. The strategy of the decision process in their use
and the corrective actions to be taken when lack of control is indicated
should be planned and followed. Because all data obtained within the
period of "last-known in-control" and "first-known out-of-control" are
suspect, laboratories must consider the risk involved when designing their
quality assurance procedures (10).
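A minimal control-chart check in Python. The article does not fix specific
limits; the common Shewhart convention of warning limits at plus or minus
2s and control limits at plus or minus 3s about the mean of in-control
history is assumed here, and the data are illustrative:

    # Minimal sketch: Shewhart-style control chart limits from in-control
    # history, then classification of new check-sample results.
    # The 2s/3s warning/control convention is assumed, not prescribed.
    import statistics

    history = [98.2, 101.5, 99.7, 100.3, 98.9, 101.1, 100.6, 99.4]  # %R values
    center = statistics.mean(history)
    s = statistics.stdev(history)

    def classify(value):
        if abs(value - center) > 3 * s:
            return "out of control"
        if abs(value - center) > 2 * s:
            return "warning"
        return "in control"

    for new in [100.8, 103.1, 95.0]:
        print(f"{new:6.1f}  {classify(new)}")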

Statistical criteria should be used in designing quality assessment
programs, which include the kind of test samples to be used and the
sequence of use, so that the establishment of statistical controls and an
estimate of bias can be made (11). Appropriate statistical concepts should
be used in the design of the quality assurance program.

Audits should be a feature of all quality assurance programs. A systems
audit should be made at appropriate intervals to assure that all aspects
of the quality assurance program are operative. Performance audits, in
which a laboratory is evaluated based on the results of analyses of blind
standard samples, also provide valuable quality assessment information.
Participation in interlaboratory and collaborative test programs is
another procedure a laboratory may use to assess the quality of its data
output.

The terms repeatability, reproducibility, intralaboratory variability, and
interlaboratory variability are sometimes used to describe various aspects
of the measurement process. Repeatability describes the variation in data
generated on a single sample by a single analyst and/or instrument over a
short period of time, while reproducibility refers to variation over an
extended period of time and/or by various analysts or laboratories.
Intralaboratory variability refers to the difference in results when a
single laboratory measures portions of a common sample repeatedly.
Interlaboratory variability refers to the difference of results obtained
by different laboratories when measuring portions of a common sample.
Because of the nonspecific nature of this terminology, the experimental
conditions must be specified whenever such terms are used.

[Figure 3. An illustration of precision and accuracy: (a) unbiased
measures with (1) high precision (low dispersion) and (2) low precision
(high dispersion); (b) biased measures with (1) high precision (low
dispersion) and (2) low precision (high dispersion).]

Variabilities due to different operators, equipment, and conditions (but
independent of sample variabilities) influence reproducibility and are
indeed components of intralaboratory variability. Laboratory biases, when
coupled with differences in the capabilities of various laboratories,
result in an interlaboratory variability that is often very much larger
than intralaboratory variability. Well-designed and well-executed quality
assurance programs help reduce both intralaboratory and interlaboratory
variability.

VERIFICATION AND VALIDATION

Verification is the general process used to decide whether a method in
question is capable of producing accurate and reliable data. Validation is
an experimental process involving external corroboration by other
laboratories (internal or external) or methods or the use of reference
materials to evaluate the suitability of methodology. Neither principle
addresses the relevance, applicability, usefulness, or legality of an
environmental measurement. A review of this subject is covered in ref 12.
Confirmation, a type of verification, is a process used to assure that the
analyte in question has been detected and measured acceptably and
reliably.

The reliability and acceptability of environmental analytical measurements
depend upon rigorous completion of all the requirements stipulated in a
well-defined protocol. Such protocols should prescribe the documentation
requirements of the study, including sampling procedures, measurements,
verification, and validation. In addition, all results should be reviewed
critically. If questions arise during the review, additional confirmatory
tests should be conducted, including the use of methods other than those
applied previously. In situations where large numbers of samples are
analyzed with widely accepted and well-documented analytical systems,
unusually high results, or unexpectedly low ones, on critical samples
should be checked by a repeat analysis of a duplicate subsample using the
same method and a third subsample analyzed by a different analytical
method. Agreement of the three results indicates that the analyte has been
measured correctly; disagreement requires careful study of its cause,
including analysis of additional samples. This requires prior planning for
the collection of sufficient amounts of sample at the beginning of the
program or for resampling if necessary.

Confidence in the measurement process is strengthened considerably by
interlaboratory comparisons, which are one of the most effective elements
of a quality assurance plan. Collaborative testing should be a
prerequisite for analytical methods used in major decision-making
processes.

    Analytical systems that are part of major decision-making processes
    should utilize collaboratively studied methods. Confidence in the
    measurement process is strengthened considerably by participation in
    check sample programs conducted by external organizations. Such
    programs are some of the most effective elements of a quality
    assurance program.

Qualitative identification should be confirmed. This confirmation should
be based on a measurement principle or on analytical conditions that are
distinctly different from those used in the initial method. The procedure
chosen should be highly selective and should refer to a different
unambiguous property that is characteristic of the analyte.

PRECISION AND ACCURACY

Precision describes the degree to which data generated from replicate or
repetitive measurements differ from one another. Statistically, this
concept is referred to as dispersion. Accuracy refers to the correctness
of the data. Unfortunately, in spite of its importance, there is no
general agreement as to how accuracy is evaluated. Inaccuracy results from
imprecision (random error) and bias (systematic error) in the measurement
process. As illustrated by Figure 3, high precision does not imply high
accuracy, and vice versa.

Unless the true value is known, or can be assumed, accuracy cannot be
evaluated. Bias can only be estimated from the results of measurements of
samples of known composition. Standard reference materials, when
available, are ideal for use in such an evaluation. Bias is often
estimated from the recovery of spiked samples. It should be remembered
that such samples may not fully simulate natural samples, so the recovery
information should be interpreted with this in mind. The accuracy of
individual analytical measurements is discussed in ref 13.

Despite the use of validated methods, principles of good laboratory
practice, and systematic quality assurance procedures, outlier data points
can frequently appear in sets of analytical measurements. Outliers are
analytical results that differ so much from the average as to be highly
improbable. Statistical techniques may be used for their identification.
Zero or negative measurements are often considered to be outliers, but
when working near the limit of detection, a certain number of analyses by
chance alone are expected to be zero (14). When outliers are discarded
from a data set, it is important that they are identified and that the
statistical or operational reasons for their deletion are given.

SAMPLING

The quality and utility of analytical data depend critically on the
validity of the sample and the adequacy of the sampling program. Sampling,
including the development of sampling plans, is often complex and may
require special expertise such as statistical input. Guidance for
developing adequate programs is contained in ref 15 and 16.

The purpose of sampling is to obtain specimens that represent the
situation being studied. Sampling plans may require that systematic
samples be obtained at specified times and places; or simple random
sampling or stratified sampling may be necessary. Generally, the sample
should be an unbiased representative of the population of interest. This
means that the sample obtained is related in probability to all other
samples that could be selected from the target population under the
specified conditions; this can only be accomplished through appropriate
randomization schemes.

All aspects of a sampling program should be planned and documented in
detail, and the expected relationship of the sampling protocol to the
analytical results should be defined. A sampling program should include
reasons for choosing sampling sites, the number of samples, the timing of
sample acquisition, and the expected level of fluctuations due to
heterogeneity. These reasons should be based on the decisions made in the
planning phase. A detailed description of sampling sites and procedures is
necessary and should include the sampling methodology, labeling, container
preparation, field blank preparation, storage, and pretreatment
procedures. An acceptable sampling program should include at least the
following: (1) a sampling plan that takes into account the goals of the
studies and the expected uncertainties associated with the number of
samples collected and the population variability; (2) instructions for
sample collection, labeling, preservation, and transport to the analytical
facility; and (3) training of personnel in the sampling techniques and
procedures specified.

The quality assurance program should include a means to demonstrate that
the sample container and storage procedures do not alter the composition
of the sample in a way that would affect the concentration or the
identification of the analyte being determined. Special transportation
procedures (such as refrigeration or exclusion of light) need to be
specified if they are required to ensure the integrity of the sample.
Detailed guidelines for sampling designs in some special situations are
available in ref 8, 14, and 15-18.

Samples should be safeguarded from loss, tampering, or misidentification.
This requires use of a unique sample identification system, safe storage
facilities, and a sample management system in which all steps are fully
documented. Samples that could be involved in potential litigation should
be protected with a chain-of-custody documentation system in addition to
the above safeguards.

Sampling Requirements. Because environmental samples are typically
heterogeneous, a large number of samples ordinarily must be analyzed to
obtain meaningful compositional data. (Measurement of a single sample can
tell nothing about environmental patterns, but only about the sample
itself.) The number of individual samples that should be analyzed will
depend on the kind of information required by the investigation. If an
average compositional value is required, a large number of randomly
selected samples may be obtained, combined, and blended to provide a
reasonably homogeneous composite sample from which a sufficient number of
subsamples are analyzed. If composition profiles or the variability of the
sample population is of interest, many samples will need to be collected
and analyzed individually.

In general, the number of samples and the quality of the sampling
procedure must be planned to facilitate characterizing the population of
interest and enhance the utility of the final results. If the sampling
plan is not imposed (for example, by regulation), the investigator will
need to decide what error and confidence levels are tolerable. Once these
are determined, the minimum number of samples necessary for specific
confidence limits that satisfy the requirements of the protocol can be
estimated. Several approaches for defining the number of such samples may
be used.

A statistical approach to determine the number of samples to be taken is
possible when the distribution and standard deviation of the population
are known or can be assumed. It is usually assumed that most chemical data
are approximated by a Gaussian or normal distribution so that well-known,
common statistical techniques can be used. This assumption is not always
correct, particularly with the very low concentrations (ppb, ppt)
encountered in environmental analyses. In such cases, a log-normal
distribution may be more suitable. In a log-normal distribution, the
logarithms of the concentrations follow a Gaussian or normal distribution
(19, 20). One consequence of the assumption of a log-normal distribution
is that the common arithmetic average is not a suitable way to estimate
the mean of a set of data. Instead, the geometric average (the average of
the logarithms) should be used. The discussion that follows is applicable
to any statistical distribution once it is assumed. It is, however,
important for the investigator to verify that the statistical distribution
being used is appropriate.
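A minimal sketch in Python of the recommended geometric average (the
average of the logarithms, exponentiated back), with illustrative
low-level concentrations:

    # Minimal sketch: geometric mean via the average of the logarithms,
    # as recommended above for log-normally distributed concentrations.
    import math

    concentrations_ppb = [0.8, 1.2, 0.5, 3.9, 1.1, 0.7]   # illustrative
    geo_mean = math.exp(sum(math.log(c) for c in concentrations_ppb)
                        / len(concentrations_ppb))
    arith_mean = sum(concentrations_ppb) / len(concentrations_ppb)
    print(f"geometric mean {geo_mean:.2f} ppb, arithmetic mean {arith_mean:.2f} ppb")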

A relationship that may be used to calculate the required number of
samples for a given standard deviation and for a given acceptable error is

$$N_s = (z s_s / e)^2 \qquad (1)$$

where $N_s$ is the number of samples, $z$ is the value of the standard
normal variate (see ref 21, Table A-2, page T-3, for example) based on the
level of confidence desired, $s_s$ is the standard deviation of the sample
population, and $e$ is the tolerable error in the estimate of the mean for
the characteristic of interest.

For illustration, assume that the sample population is expected to have a
mean concentration of 0.1 ppm with a standard deviation of 0.05 ppm, and
that the tolerable error in the stated value of the mean at the 95%
confidence level (z = 1.96) is not to exceed 20% (0.02 ppm). A further
assumption made is that the measurement error is small in comparison with
the measured values and can be neglected in the calculation. With the
above values, the approximate number of samples required will be

$$N_s = [(1.96 \times 0.05)/0.02]^2 \approx 24 \qquad (2)$$
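A minimal sketch in Python of eq 1, reproducing the worked example of
eq 2:

    # Minimal sketch: required number of samples, N_s = (z * s_s / e)^2
    # (eq 1), evaluated with the example values from the text (eq 2).
    def n_samples(z, s_population, tolerable_error):
        return (z * s_population / tolerable_error) ** 2

    print(f"N_s = {n_samples(1.96, 0.05, 0.02):.1f}")   # -> 24.0; round up in practice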

Unfortunately, environmental analyses often are done where the expected
levels and the standard deviation of the population are not known in
advance and where the measurement error cannot be predicted adequately,
nor can it be assumed to be negligible. Nevertheless, the measurement
error must be estimated from the analytical results. In this case, the
measured values of similar samples can be used to calculate an overall
standard deviation, $s_o$, which is related to the standard deviation of
measurement, $s_m$, and the standard deviation of the sample population,
$s_s$, by the expression

$$s_o^2 = s_m^2 + s_s^2 \qquad (3)$$

An estimate of $s_m$ can be obtained by a pooling process, using the
differences in the measured values of duplicate samples (see ref 22,
p 316). Then the population standard deviation, $s_s$, can be estimated.
Unless such calculations are based on a sufficient number of measurements
(at least seven), the standard deviations may be significantly
overestimated or underestimated. To minimize this problem, the appropriate
value of the Student's t distribution (ref 21, Table A-4, page T-5) should
be used and t values should be substituted for z in eq 1 and similar
expressions.

An equation of the form of eq 1 may be used also to estimate the number of
replicate measurements, $N_m$, required of a homogeneous sample to achieve
a mean value within a given confidence interval $e$:

$$N_m = (z s_m / e)^2 \qquad (4)$$

In this case, $s_m$ represents the standard deviation of the measurement
process. Unless $N_m$ is large, the Student's t distribution should be
used in place of the standardized normal variate $z$.
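A minimal sketch in Python combining a pooled estimate of $s_m$ from
duplicate pairs with eq 4. The pooling formula $s_m^2 = \sum d_i^2 / (2k)$
for k duplicate pairs is a standard convention assumed here (the text
cites ref 22 for the procedure), and all data are illustrative:

    # Minimal sketch: pool the measurement standard deviation s_m from
    # duplicate pairs, then estimate replicates needed via eq 4,
    # N_m = (z * s_m / e)^2. The pooling formula s_m^2 = sum(d^2)/(2k)
    # is a standard convention assumed here; data are illustrative.
    import math

    duplicate_pairs = [(4.1, 4.3), (3.8, 3.6), (4.5, 4.4), (4.0, 4.3),
                       (4.2, 4.0), (3.9, 4.1), (4.4, 4.1)]   # at least 7 pairs
    k = len(duplicate_pairs)
    s_m = math.sqrt(sum((a - b) ** 2 for a, b in duplicate_pairs) / (2 * k))

    z, e = 1.96, 0.2                  # 95% confidence, tolerable error
    n_m = (z * s_m / e) ** 2
    print(f"s_m = {s_m:.3f}; N_m = {n_m:.1f} replicate measurements")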

An empirical approach to sampling is sometimes used. For example, the
N-N-N concept (13) has been suggested, whereby equal numbers (N) of
samples, blanks (a no-treatment control sample), and spiked blanks are to
be analyzed. This concept was first used in U.S. Department of Agriculture
pesticide residue studies as the "10-10-10 rule". In an extensive
monitoring program, the number of field blanks (blanks from a similar
source that do not contain the analytes of interest) and blanks spiked
with the analytes at known concentrations can be substantially less than
the number suggested by the N-N-N approach.

Blanks. If field blanks are not available, every effort should be made to
obtain blank samples that best simulate a sample that does not contain the
analyte. In certain circumstances, a simulated or synthetic field blank is
the only alternative.

In addition, measurements should be made to ascertain whether and to what
extent any analytical reagents or solvents used contribute to or interfere
with the measurement results. Good laboratory practices are required to
control the level and variability of artifacts and interferences in
reagent and solvent blanks, and a sufficient number of reagent and solvent
blanks must be measured to evaluate potential interferences with
sufficient confidence (23, 24).

Use of Control Sites. When environmental measurements are made to
investigate localized contamination (e.g., at a hazardous waste site or a
point source discharge), measurement of concentrations at sites recognized
as uncontaminated (control sites) is usually required. Such sites and the
numbers of samples to be analyzed must be chosen carefully and be adequate
to establish clearly the significance of any apparent differences that may
be indicated. As the concentration levels in samples from test and control
sites approach each other, the analysis of control site samples becomes
increasingly important. The planning of environmental analytical programs
and interpretation of the results of such monitoring programs will require
utilization of statistical principles.

Field Control Samples vs. Laboratory Control Samples. The recovery of
spikes is used frequently to evaluate analytical methodology.
Uncontaminated samples from control sites that have been spiked with the
analytes of interest provide the best information because they simulate
any matrix effects. Although commonly referred to as "spiked blanks", when
a known amount of a reference material is spiked into a liquid, it becomes
a "standard solution"; so "spiked blank" is incorrect terminology and its
use is discouraged. Simulated or synthetic field blanks may be spiked when
suitable blanks from control sites are unavailable. When feasible,
isotopically labeled analytes spiked into samples provide the greatest
accuracy since they are subjected to the same matrix effects as the
analyte. Exceptions may occur with some solid samples (including
biological samples).

The object of spiking a blank or a sample "in the field" (field control
samples) at the time samples are collected, vs. spiking a sample or blank
in the laboratory (laboratory control samples) prior to analysis, is to
determine if there are any matrix effects caused by time and/or the
conditions under which the sample is taken, transported, and stored prior
to actual analysis. Although field control samples are a useful type of
spike, they are also the most difficult to prepare because of special
technical procedures that must be devised to prepare them accurately.
Because of this, most spikes are made after samples have been returned to
a laboratory, possibly hours, days, or even longer periods of time after
the samples were taken.

The recovery of both laboratory and field control samples must be
interpreted with due consideration for possible differences of behavior
between naturally incorporated substances and those added artificially.
For example, if an analyte is strongly associated with any other component
in the sample matrix, it may not be possible to recover it as efficiently
as from the appropriate blank matrix that has been spiked and then
immediately extracted.

MEASUTlE.ME>n'S 

Measurements should be made with properly tested 
and documented procedures. Furthermore, every labora- 
tory and analyst must conduct sufficient prelimmao* testa, 
using the methodology and typical samples, to demonstrate 
competence Ln the use of the measurement procedure i 12). 
Procedures should utilize controls and calibration steps 
to minimize random and systematic errors. vVhen pos- 
sible, procedures should provide the required precision, 
minimum artifact contamination, and the best recovery 
possible. Contamination can be introduced from sampling 
containers, equipment, reagents, solvents, glassware, atmo- 
sphere and added surrogates, or internal standards. Keeping 
the number and complexity of operations to a minimum will 
lessen contamination possibilities from these sources. 

To establish confidence in the analytical measurements,
data obtained on sensitivity, accuracy, precision, and recovery
should be comparable, when possible, to literature values
obtained on similar problems. "State-of-the-art" analytical
techniques are not always necessary, but the results and
methods used should be able to stand the test of peer review.

Definition of the Data Set. Environmental analytical
measurement programs should be designed to obtain quality
assurance data from the following types: calibration standards;
field samples; field blanks; spiked field or laboratory blanks;
and reference samples. Field sample data are the objectives
of an investigation. Field blanks are necessary to account for
the presence of spurious analytes, interferences, and back-
ground concentrations of the analyte of interest. Spiked field
or laboratory samples are required to establish recovery.
Reference samples are required as specified in the quality
assurance program and project plans to document the quality
of the field sample measurements. Calibration standards are
required to provide a basis for quantitating the analytes of
interest. The frequency and order for measuring a sequence
of these various samples and blanks should be defined in the
protocols developed in the planning stage of the program.

Preparation of Samples. After a sample has been ob-
tained, the analytical protocol often requires one or more
treatments prior to actual measurement of the analytes.
Sample preparation may involve physical operations such as
sieving, blending, crushing, and drying, and/or chemical operations
such as dissolution, extraction, digestion, fractionation, de-
rivatization, pH adjustment, and the addition of preservatives,
standards, or other materials. These physical and chemical
treatments not only add complexity to the analytical process
but are potential sources of bias, variance, contamination, and
mechanical loss. Therefore, sample preparation should be
planned carefully and documented in sufficient detail to
provide a complete record of the sample history. Further,
samples taken specifically to test the quality assurance system
(i.e., quality assurance samples) should be subjected to these
same preparation steps.
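For example, a minimal record-keeping sketch (hypothetical sample identifiers and preparation steps; a real program would follow its own documented protocol) might log each step so the sample history can be reconstructed later:

    # Hypothetical sketch: logging sample preparation steps.
    from datetime import datetime

    history = []

    def log_step(sample_id, step, detail):
        # Append one preparation step to the sample history record.
        history.append({"sample": sample_id, "step": step,
                        "detail": detail, "time": datetime.now().isoformat()})

    log_step("SJV-001", "sieving", "2 mm stainless-steel sieve")
    log_step("SJV-001", "digestion", "HNO3, 95 C, 2 h")
    log_step("SJV-001", "dilution", "to 100 mL with deionized water")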

The analyst must recognize and be aware of the risks that
are associated with each form of pretreatment and take ap-
propriate preventative action for each. This may include
reporting, correcting for, or possibly removing interferences
from the analytes of interest by modifying the protocol. All
changes in the protocol must be documented.

Calibration and Standardization. Calibration is the
process for determining the correctness of the assigned values
of the physical standards used or the scales of the measuring
instruments. Typical calibrations include standards for mass,
volume, and length; also, instruments that measure temper-
ature, pH, and chemical composition can be calibrated. The
term standardization is used frequently to describe the de-
termination of the response function of analytical instruments.

Calibration accuracy is critically dependent on the reliability
of the standards used for the required intercomparisons.
Likewise, chemical calibration or standardization depends
critically upon the quality of the chemicals used to provide
the necessary standard solutions and the care exercised in their
preparation. Where possible, calibration should be performed
by suitable regression analysis of the net signal on the analyte
concentration. At least three different concentrations of
calibration standards should be measured in triplicate,
but more than three different concentrations are rec-
ommended. The concentrations of the calibration standards
must bracket the expected concentration of the analyte in the
samples. No data should be reported beyond the range
of calibration of the methodology. The calibration data,
when plotted graphically, are referred to as a calibration curve.
The calibration must be done under the same instrumental
and chemical conditions as those that will exist during the
measurement process, and the frequency of calibration de-
pends on the accuracy requirements of the investigation and
the stability of the instrument used for the measurements.
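A minimal sketch of such a calibration (hypothetical standard concentrations and responses; three standards measured in triplicate, first-order least-squares fit, with a range check before reporting) might be:

    # Hypothetical calibration sketch: regression of net signal on concentration.
    import numpy as np

    conc = np.repeat([1.0, 5.0, 10.0], 3)            # standards (ug/mL), in triplicate
    signal = np.array([0.11, 0.10, 0.12,
                       0.52, 0.50, 0.53,
                       1.01, 0.99, 1.02])            # net instrument responses

    slope, intercept = np.polyfit(conc, signal, 1)   # calibration curve

    def to_concentration(net_signal):
        # Convert a net signal to concentration; refuse to extrapolate
        # beyond the range of the calibration standards.
        x = (net_signal - intercept) / slope
        if not conc.min() <= x <= conc.max():
            raise ValueError("outside calibration range; do not report")
        return x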

Internal standardization involves the addition of a reference
material to the sample. External standardization involves use
of a reference material separately. The internal standard
material is chosen to simulate the analyte of interest; the
external standard material is usually the analyte being mea-
sured. The ratio of the response of the internal standard to
the analyte response is called the response constant and is used
to calculate analyte concentration.
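For illustration (hypothetical peak areas and concentrations; the function name is not from the original text), the internal-standard calculation can be sketched as:

    # Hypothetical internal-standardization sketch.
    def response_constant(area_analyte, conc_analyte, area_istd, conc_istd):
        # Response constant determined from a standard containing known
        # amounts of both the analyte and the internal standard.
        return (area_analyte / area_istd) / (conc_analyte / conc_istd)

    rc = response_constant(2100.0, 5.0, 2000.0, 5.0)

    # Applied to a sample spiked with 5.0 ug/mL of internal standard:
    conc_sample = (1470.0 / 1980.0) / rc * 5.0
    print(f"analyte concentration = {conc_sample:.2f} ug/mL")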

Surrogates are sometimes incorrectly termed internal
standards. Compounds closely related chemically to the an-
alyte of concern may be used as spikes; in this document, this
technique is known as "surrogate spiking". The difference
is that an internal standard is the reference material against
which the signal from the analyte is compared directly, whereas
the signal from a surrogate is not used directly for quantita-
tion. Both types of reference materials are chosen to simulate
the analyte of interest. Surrogates may be used indirectly for
quantitation of analytes as, for example, in determining re-
covery efficiency during sample pretreatment.

Another technique is one of "standard addition", where
successive, increasing known amounts of analytes are added
to the sample or aliquots of it. In each of these techniques,
the spiked sample is analyzed under exactly the same con-
ditions as the actual sample, including all phases of pre-
treatment. Recoveries of the analytes of interest are inferred
from those found when using spiked materials.
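A minimal sketch of the standard-addition calculation (hypothetical responses; assumes a linear response through the spiked range) is:

    # Hypothetical standard-addition sketch: extrapolate to the x-intercept.
    import numpy as np

    added = np.array([0.0, 2.0, 4.0, 6.0])       # analyte added to aliquots (ug/L)
    signal = np.array([0.30, 0.50, 0.71, 0.90])  # instrument responses

    slope, intercept = np.polyfit(added, signal, 1)
    conc_original = intercept / slope            # magnitude of the x-intercept
    print(f"estimated original concentration = {conc_original:.2f} ug/L")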

It is essential to demonstrate that the spiking procedure
is valid. It must be shown either that the spiked chemicals
equilibrate with the corresponding endogenous ones, or that
the recovery of the spiked chemicals is the same as the re-
covery of the endogenous chemicals, within experimental error,
over the full range of concentration levels to be analyzed (24).

Recovery. Recovery of analytes is influenced by such
factors as concentration of the analytes, sample matrix, and
time of storage. Because recovery often varies with concen-
tration, the spike and the analyte concentrations should
be as close as practical. When the spike and/or analyte
concentrations are close to the background concentration,
recoveries can be highly variable.
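As a sketch (hypothetical values), percent recovery is commonly computed from paired spiked and unspiked portions of the same sample:

    # Hypothetical spike-recovery sketch.
    def percent_recovery(spiked_result, unspiked_result, amount_added):
        # Recovery of the added spike, in percent.
        return 100.0 * (spiked_result - unspiked_result) / amount_added

    r = percent_recovery(spiked_result=9.2, unspiked_result=4.5, amount_added=5.0)
    print(f"spike recovery = {r:.0f}%")    # 94% for these values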

Matrix effects can cause wide variability in recoveries, es-
pecially with organic compounds. Therefore, to be valid,
recoveries of a spiked standard must be determined in the
same matrix as the sample. It is not unusual to find sig-
nificant recovery differences for organic standards spiked into
samples of industrial wastewaters that have been taken only
days (or even hours) apart because the composition of
wastewater samples often changes significantly with time.
Another consideration is the amount of time a sample has been
stored before sample pretreatment and analysis. Analyzing
a sample that has been stored for a long period of time will
sometimes give different values for the analytes than when
the sample is analyzed while fresh. This difference may be
caused by changes in the matrix and/or the analytes.
Therefore, if it is anticipated that a sample is going to be
stored for an appreciable period of time before analysis, it
should be demonstrated that significant changes have not
occurred. What an "appreciable period of time" is depends
on the analytes, sample matrix, and other chemical and
physical factors that are determined in the planning phase.
An appropriately designed quality assurance program can
assist in determining the potential effects of storage time on
analyte loss and will specify maximum holding times as
necessary.

Variable recovery is sometimes resolved by using the isotope
dilution technique, in which isotopes of the analytes being
measured are spiked directly into the sample. Both radio-
actively and nonradioactively labeled compounds or elements
can be used. This principle is based on the assumption that
the labeled compound or element will behave in an essentially
identical manner to the unlabeled analyte of interest.

Generally, analytical values should be reported as measured
(uncorrected for recovery) with full and complete supporting
data involving recovery experiments. If the measurements
are reported as "recovery-corrected", all calculations and ex-
perimental data should be documented so that the original
uncorrected values can be derived if desired. In carrying out
recovery studies, the analyst should recognize that an analyte
added to a blank sample may behave differently (typically,
showing higher recovery) than an analyte in a field sample.
In such a case, the method of standard addition tends to lead
to erroneously low values. Whenever possible, testing should
include experiments on homogeneous working standards
containing known amounts of naturally incorporated analyte.
Unfortunately, the frequent lack of such environmental
standard reference materials is a major limitation in analyses
and is a more acute problem with organic standards than with
inorganic or elemental standards.

Interferences. Because of the complexity of environ-
mental samples and the limited selectivity of most metho-
dologies, interferences are common during analysis. Ap-
propriate controls and experiments must be included to
verify that interferences are not present with the ana-
lytes of interest or, if they are, that they be removed or
accommodated. Interferences that are not accommodated
cause incorrect analytical results, including false positives and
false negatives.

Interferences arise from two sources that may occur si-
multaneously: constituents that are inherent in the sample,
and artifacts or contaminants that have been introduced
during the analytical process. A good measurement plan
and quality assurance program, incorporating an ap-
propriate experimental design and field and method
blanks, is critical to identifying the sources of inter-
ferences in a sample and differentiating between in-
terferences and artifacts. Interferences are generally re-
moved by modifying the methodology. Artifacts are avoided
or identified by an appropriate, comprehensive quality as-
surance program.

[Figure: Relationship of the LOD and LOQ to signal strength. The LOD
is located 3σ above the measured average difference in the total (St)
and blank (Sb) signals; the LOQ is 10σ above Sb.]

Limit of Detection (LOD). The limit of detection (LOD)
is defined as the lowest concentration level that can be de-
termined to be statistically different from a blank. The
concept is reviewed in ref 25 together with the statistical basis
for its evaluation. Additional concepts include the method de-
tection limit (MDL), which refers to the lowest concentration
of analyte that a method can detect reliably in either a sample
or blank, and the instrument detection limit (IDL), which
refers to the smallest signal above background noise that an
instrument can detect reliably. Sometimes, the IDL and LOD
are operationally the same. In practice, an indication of
whether an analyte is detected by an instrument is sometimes
based on the extent to which the analyte signal exceeds
peak-to-peak noise. The IDL and especially the MDL are
important parameters for comparing and selecting instru-
mentation and methodology. The experimental determination
of the MDL is discussed in ref 26.

The question of detection of a given analyte is often one
of the most important decisions in low-level analysis. The
question that must be answered is whether a measured value
is significantly different from that found for the sample blank.
Let St represent the total value measured for the sample, Sb
the value for the blank, and σ the standard deviation for these
measurements. The analyte signal is then the difference St
- Sb. It can be shown that for normal distributions St - Sb
> 0 at the 99% confidence level when that difference (St -
Sb) > 3σ. The recommended value of the LOD is 3σ. The LOD is
numerically equivalent to the MDL as Sb approaches zero.
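Under the 3σ and 10σ definitions used here, the LOD and LOQ can be estimated from replicate blank measurements, as in this sketch (hypothetical blank signals):

    # Hypothetical LOD/LOQ sketch based on replicate blanks.
    import statistics

    blank_signals = [0.021, 0.018, 0.024, 0.019, 0.022, 0.020, 0.023]
    sigma = statistics.stdev(blank_signals)   # sample standard deviation

    lod = 3 * sigma     # limit of detection
    loq = 10 * sigma    # limit of quantitation
    print(f"LOD = {lod:.4f}, LOQ = {loq:.4f} (signal units)")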

Limit of Quantitation (LOQ). The limit of quantitation
(LOQ) is defined as the level above which quantitative results
may be obtained with a specified degree of confidence.
Confidence in the apparent analyte concentration increases
as the analyte signal increases above the LOD. The value of
LOQ = 10σ is recommended, corresponding to an uncertainty
of ±30% in the measured value (10σ ± 3σ) at the 99% con-
fidence level.

The LOQ is most useful for defining the lower limit of the
useful range of a measurement methodology. This range ex-
tends from this lower value to an upper value where the
response is no longer linear, which is sometimes referred to as
the limit of linearity.

The LOD and LOQ are shown graphically in the figure
above. The base scale is in units of the standard deviation of
the measurement process, which is assumed to be the same
for all of the measurements indicated.

Considerations of the above guidelines for reporting low-
level data are given in Table I. In using the table, it should
be remembered that the concentration levels indicated refer
to interpretation of single measurements.

Signals below 3σ should be reported as "not detected" (ND),
and the limit of detection should be given in parentheses:
ND (LOD = value).



Table I. Guidelines for Reporting Data

    analyte concn (St - Sb),
    in units of σ               region of reliability
    -------------------------   ------------------------------------------
    <3σ                         region of questionable detection
                                (and therefore unacceptable)
    3σ                          limit of detection (LOD)
    3σ to 10σ                   region of less-certain quantitation
    10σ                         limit of quantitation (LOQ)
    >10σ                        region of quantitation


quantitation" {2<r to 1Q<t) should be reported as detections with 
the Limit of detection given Ln parentheses. The practice of \ 
using the symbols T" or "tr" for amounts and the term "trace" 
and siimLar statements of relative concentration should be 
avoided because of the relative nature of such terminology, 
the confusion surrounding it, and the danger of its misuse. 
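The reporting rules of Table I can be summarized in a small sketch (hypothetical function; the signal, sigma, and LOD values are in the same units as the measurement):

    # Hypothetical sketch applying the Table I reporting guidelines to one
    # net measurement (St - Sb).
    def report(net_signal, sigma, lod):
        k = net_signal / sigma
        if k < 3:
            return f"ND (LOD = {lod})"               # not detected
        if k < 10:
            return f"{net_signal} (LOD = {lod})"     # less-certain quantitation
        return f"{net_signal}"                       # region of quantitation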

Data measured at or near the limit of detection have two
problems. The uncertainty can approach and even equal the
reported value. Furthermore, confirmation of the species
reported is virtually impossible; hence the identification must
depend solely on the selectivity of the methodology and
knowledge of the absence of possible interferents. These
problems diminish when measurable amounts of analytes are
present. Accordingly, quantitative interpretation, deci-
sion-making, and regulatory actions should be limited
to data at or above the limit of quantitation.

It must be emphasized that the LOD and LOQ are not
intrinsic constants of the methodology but depend upon the
precision attainable by a laboratory when using it, which can
be very diverse. Attainable precision can also vary according
to the matrix analyzed. Published values of LODs must be
considered only as typical. Each laboratory reporting data
must evaluate its own precision and estimate its own LOD
and LOQ values when they are important aspects of the data.

Analytical chemists must always emphasize to the
public that the single most important charac-
teristic of any result obtained from one or more
analytical measurements is an adequate state-
ment of its uncertainty interval (14). Lawyers
usually attempt to dispense with uncertainty and try
to obtain unequivocal statements; therefore, an un-
certainty interval must be clearly defined in cases
involving litigation and/or enforcement proceedings.
Otherwise, a value of 1.001 without a specified un-
certainty, for example, may be viewed as legally ex-
ceeding a permissible level of 1.

DOCUMENTATION AND REPORTING

Documentation of analytical measurements should
provide information sufficient to support all claims
made for all the results. Documentation requires all in-
formation necessary to (1) trace the sample from the field to
the final results, (2) describe the methodology used, (3) de-
scribe the confirmatory evidence, (4) support statements about
detectability, (5) describe the QA program and demonstrate
adherence to it, and (6) support confidence statements for the
data.

Data, including all instrumental output, must be recorded
in laboratory notebooks or other suitable media and should
include complete sample documentation, transfers and
movement, sample number, initial sample weight, extraction
volume, final weight and volume analyzed, instrument re-
sponse, sample calculations, and concentration of sample as
appropriate. The time that "spiked" samples and blanks were
run relative to measurement of the analyte is of utmost im-
portance when measurements are attempted close to the limit
of detection. Laboratory records should be retained in
a permanent file for a length of time set by government,
other legal requirements, or the employing institution,
whichever is longer. Bound notebooks are preferred to
loose-leaf-type notebooks.

Electronic data handling, reduction, storage, and trans-
mission systems greatly facilitate data handling and help
minimize errors due to misreading, faulty transcription, or
miscalculations. However, the performance of the data system
must be tested periodically with known data that have already
been calculated; this should be part of the quality assurance
program. These tests must have sufficient diversity and rigor
to provide a reliable test of the data handling system.
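For example, a minimal known-value check of a data-reduction routine (hypothetical function and hand-calculated value) might be run periodically:

    # Hypothetical data-system check against a hand-calculated result.
    def reduce_datum(raw_signal, blank, slope, intercept):
        # Blank-correct a raw signal and convert it to concentration.
        return ((raw_signal - blank) - intercept) / slope

    # Hand calculation: ((1.05 - 0.05) - 0.0) / 0.1 = 10.0
    assert abs(reduce_datum(1.05, 0.05, 0.1, 0.0) - 10.0) < 1e-9, \
        "data system failed the known-value check"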

The analytical chemist is responsible for fully de-
scribing and interpreting the data and reporting them in
an appropriate manner.

Measurement results should be expressed so that their
meaning is not distorted by the reporting process. The
public at large will not be able to recognize that 10,000 ng/kg
and 10 µg/kg are the same.

Data should be reported only to the number of significant
figures consistent with their limits of uncertainty. When
appropriate, the relationship between individual sample
values, blanks, recoveries, and other supporting data should
be shown.

If possible, and within the scope of desired results, a number
of measurements sufficient for statistical treatment should
be made. When this is not the case, an explanation is nec-
essary, including complete details of the treatment of the data.

Reports should make clear which results, if any, have
been corrected for blank and recovery measurements.
Any other limitations should also be noted.

Conclusions as to whether a signal is detected, whether a
positive signal is confirmed to be an analyte, how much un-
certainty is contributed by the sampling, and the risk of
systematic error are best made by those involved in the study
and should be included in any report. Reports should
contain sufficient data and information so that users of
the conclusions can understand the interpretations
without having to make their own interpretations from
raw data.

If a published methodology is used, it should be cited.
Any modification, as well as any new methodology or new
approach to the making of measurements or interpreting the
results, must be described in detail, including test results and
details of its validation.

Raw data for each sample, along with reagent blanks,
control, and "spiked" samples, should be suitably identified
if included in a report. If average values are reported, an
expression of the precision, including the number of
measurements, must be included. Details should be written
with the standard deviation of the mean and presented
showing that the averaging process accounts for sample
heterogeneity as well as observed imprecision among replicate
measurements of homogenized samples (1).

ACKNOWLEDGMENT

Many individuals contributed to the development of this
publication, and we are grateful for the time and effort they
took to ensure its completeness and accuracy. The American
Chemical Society was aided in this effort by a grant from the
U.S. Environmental Protection Agency. We especially ac-
knowledge the following individuals who reviewed its contents:
Stuart C. Black [Environmental Protection Agency (EPA)],
Robert L. Booth (EPA), William Budde (EPA), Thomas
Cairns [Food and Drug Administration (FDA)], William
Coakley (EPA), William T. Donaldson (EPA), David Firestone
(FDA), Fred Freeberg (The Procter & Gamble Co.), Harry
Hertz (National Bureau of Standards), William Horwitz
(FDA), Richard A. Hummel (Dow Chemical USA), Ron Kagel
(Dow Chemical Co.), Cliff J. Kirchmer (Roy F. Weston, Inc.),
James Lichtenburg (EPA), James McKinney (National In-
stitute of Environmental Health Sciences), David McLean
(Association of Official Analytical Chemists), Patrick O'Keefe
(New York State Department of Health), Lloyd Provost
(Radian Corp.), Rosa K. Robeson (EPA), Lockhart B. Rogers
(University of Georgia), John Santolucito (EPA), Richard H.
Wendt (The Procter & Gamble Co.), Hugh Wise (EPA), and
M. Young (Veterans Administration).

LITERATURE CITED

(1) ACS Committee on Environmental Improvement. "Guidelines for Data Acquisition and Data Quality Evaluation in Environmental Chemistry". Anal. Chem. 1980, 52, 2242-2248.
(2) Keith, L. H. In "Dynamics, Exposure and Hazard Assessment of Toxic Chemicals"; Haque, R., Ed.; Ann Arbor Science Publishers: Woburn, MA, 1980.
(3) Taylor, J. K. "Quality Assurance of Chemical Measurements". Anal. Chem. 1981, 53, 1588A-1596A.
(4) Inhorn, S. L., Ed. "Quality Assurance Practices for Health Laboratories"; American Public Health Association: New York, 1978; p 19.
(5) Horwitz, W. Anal. Chem. 1979, 51, 741A.
(6) "EPA Proposed Guidelines for Registering Pesticides in the U.S."; Environmental Protection Agency: Washington, DC, June 25, 1975; Fed. Regist. 40, No. 123, 26802-26928.
(7) "Quality Assurance Handbook for Air Pollution Measurement Systems"; Volume 1, EPA 600/9-76-005, March 1976; Volume 2, EPA 600/4-77-027a, May 1977; and Volume 3, EPA 600/4-77-027b, August 1977.
(8) "Methods for Chemical Analysis of Water and Wastes"; EPA 625/6-74-003, 1974.
(9) "ASTM Manual on Presentation of Data and Control Chart Analysis"; ASTM: Philadelphia, PA, 1976; STP 15D.
(10) Taylor, J. K. "What is Quality Assurance?"; Proceedings, Conference on Quality Assurance of Environmental Measurements; Boulder, CO; ASTM, August 1983.
(11) "EPA Proposed Guidelines for Registering Pesticides in the U.S."; Environmental Protection Agency: Washington, DC, July 10, 1978; Fed. Regist. 43, No. 132, 29638-29741.
(12) Taylor, J. K. "Validation of Analytical Methods". Anal. Chem. 1983, 55, 600A-608A.
(13) Kirchmer, C. J. Environ. Sci. Technol. 1983, 17, 174A.
(14) Rogers, L. B., et al., Eds. "Recommendations for Improving the Reliability and Acceptability of Analytical Chemical Data Used for Public Purposes". Chem. Eng. News 1982, 60(23), 44.
(15) de Vera, E. R.; Simmons, B. P.; Stephens, R. D.; Storm, D. L. "Samplers and Sampling Procedures for Hazardous Waste Streams"; Environmental Protection Agency: Washington, DC, January 1980; EPA-600/2-80-018.
(16) Kratochvil, B.; Taylor, J. K. "Sampling for Chemical Analysis". Anal. Chem. 1981, 53, 924A-938A.
(17) Scheide, E. P.; Filliben, J. J.; Taylor, J. K. "Survey of the Occurrence of Mercury, Lead, and Cadmium in the Washington, D.C. Area"; National Bureau of Standards: Washington, DC, September 1977; NBSIR 77-1433.
(18) Watts, R. R., et al., Eds. "Analysis of Pesticide Residues in Human and Environmental Samples"; U.S. Environmental Protection Agency: Washington, DC, June 1980; EPA-600/8-80-038.
(19) Koch, A. L. J. Theor. Biol. 1966, 12, 276-290.
(20) Koch, A. L. J. Theor. Biol. 1969, 23, 251-268.
(21) Natrella, M. G. NBS Handb. (U.S.) 1963, No. 91.
(22) Ku, H., Ed. NBS Spec. Publ. (U.S.) 1969, No. 300, Vol. 1.
(23) Murphy, T. J. NBS Spec. Publ. (U.S.) 1976, No. 422, 509-539.
(24) McKinney, J. D., et al. In "Environmental Health Chemistry"; McKinney, J. D., Ed.; Ann Arbor Science Publishers: Woburn, MA, 1980.
(25) Long, G. L.; Winefordner, J. D. Anal. Chem. 1983, 55, 712A-724A.
(26) Glaser, J. A.; Foerst, D. L.; McKee, G. D.; Quave, S. A.; Budde, W. L. Environ. Sci. Technol. 1981, 15, 1426-1435.






Appendix 3 
Sample Preservation Methods and Recommended 
Holding Times 






Appendix 3



SAMPLE PRESERVATION METHODS AND RECOMMENDED HOLDING TIMES 



TABLE 1. WATER PARAMETERS (1, 2)

    Measurement              Vol. Req.  Container  Preservative           Holding Time (7)
                             (mL)
    -----------------------------------------------------------------------------------------
    Acidity                  100        P, G (3)   Cool, 4°C              24 Hours
    Alkalinity               100        P, G       Cool, 4°C              24 Hours
    Arsenic                  100        P, G       HNO3 to pH<2           6 Months
    Biological Oxygen
      Demand (BOD)           1,000      P, G       Cool, 4°C              6 Hours (4)
    Bromide                  100        P, G       Cool, 4°C              24 Hours
    Chemical Oxygen
      Demand (COD)           50         P, G       H2SO4 to pH<2          7 Days
    Chloride                 50         P, G       None Req.              7 Days
    Chlorine Req.            50         P, G       Cool, 4°C              24 Hours
    Color                    50         P, G       Cool, 4°C              24 Hours
    Cyanides                 500        P, G       Cool, 4°C;             24 Hours
                                                   NaOH to pH 12
    Dissolved Oxygen
      Probe                  300        G only     Det. onsite            No Holding
      Winkler                300        G only     Fix onsite             No Holding
    Fluoride                 300        P, G       Cool, 4°C              7 Days
    Hardness                 100        P, G       Cool, 4°C              7 Days
    Iodide                   100        P, G       Cool, 4°C              24 Hours
    MBAS                     250        P, G       Cool, 4°C              24 Hours
    Metals
      Dissolved              200        P, G       Filter onsite;         6 Months
                                                   HNO3 to pH<2
      Suspended              --         --         Filter onsite          6 Months
      Total                  100        --         HNO3 to pH<2           6 Months
    Mercury
      Dissolved              100        P, G       Filter;                38 Days (Glass),
                                                   HNO3 to pH<2           13 Days (Hard Plastic)
      Total                  100        P, G       HNO3 to pH<2           38 Days (Glass),
                                                                          13 Days (Hard Plastic)
    Nitrogen
      Ammonia                400        P, G       Cool, 4°C;             24 Hours (5)
                                                   H2SO4 to pH<2
      Kjeldahl               500        P, G       Cool, 4°C;             24 Hours (5)
                                                   H2SO4 to pH<2
      Nitrite                100        P, G       Cool, 4°C;             24 Hours (5)
                                                   H2SO4 to pH<2
      Nitrate                50         P, G       Cool, 4°C              24 Hours (5)
    NTA                      50         P, G       Cool, 4°C              24 Hours
    Oil & Grease             1,000      G only     Cool, 4°C;             24 Hours
                                                   H2SO4 to pH<2
    Organic Carbon           25         P, G       Cool, 4°C;             24 Hours
                                                   H2SO4 to pH<2
    pH                       25         P, G       Cool, 4°C;             6 Hours (4)
                                                   Det. onsite
    Phenolics                500        G only     Cool, 4°C;             24 Hours
                                                   H3PO4 to pH<4;
                                                   1.0 g CuSO4/L
    Phosphorus
      Orthophosphate,        50         P, G       Filter onsite;         24 Hours (5)
        Dissolved                                  Cool, 4°C
      Hydrolyzable           50         P, G       Cool, 4°C;             24 Hours (5)
                                                   H2SO4 to pH<2
      Total                  50         P, G       Cool, 4°C              24 Hours (5)
      Total Dissolved        50         P, G       Filter onsite;         24 Hours (5)
                                                   Cool, 4°C
    Residue
      Filterable             100        P, G       Cool, 4°C              7 Days
      Non-Filterable         100        P, G       Cool, 4°C              7 Days
      Total                  100        P, G       Cool, 4°C              7 Days
      Volatile               100        P, G       Cool, 4°C              7 Days
    Settleable Matter        1,000      P, G       None Req.              24 Hours
    Selenium                 50         P, G       HNO3 to pH<2           6 Months
    Silica                   50         P only     Cool, 4°C              7 Days
    Specific Conductance     100        P, G       Cool, 4°C              24 Hours (6)
    Sulfate                  50         P, G       Cool, 4°C              7 Days
    Sulfide                  50         P, G       2 mL zinc acetate      24 Hours
    Sulfite                  50         P, G       Cool, 4°C              24 Hours
    Temperature              1,000      P, G       Det. onsite            No Holding
    Threshold Odor           200        G only     Cool, 4°C              24 Hours
    Turbidity                100        P, G       Cool, 4°C              7 Days





1/ More specific instructions for preservation and sampling are found
   with each procedure as detailed in this manual. A general discussion
   on sampling water and industrial wastewater may be found in
   ASTM, Part 31, pp. 68-73 (1973).

2/ U.S. Environmental Protection Agency, Office of Research and
   Development, Environmental Monitoring and Support Laboratory. 1979.
   Methods for Chemical Analyses of Water and Wastes. Cincinnati, Ohio.
   EPA-625/5-74-003.

3/ Plastic or Glass.

4/ If samples cannot be returned to the laboratory in less than 6 hours
   and holding time exceeds this limit, the final reported data should
   indicate the actual holding time.

5/ Mercuric chloride may be used as an alternate preservative at a
   concentration of 40 mg/L, especially if a longer holding time is
   required. However, the use of mercuric chloride is discouraged
   whenever possible.

6/ If the sample is stabilized by cooling, it should be warmed to 25 °C
   for reading, or temperature correction should be made and results
   reported at 25 °C.

7/ It has been shown that samples properly preserved may be held for
   extended periods beyond the recommended holding time.






Appendix 4 



Glossary 






Appendix 4



GLOSSARY



Agency: U.S. Bureau of Reclamation (USBR), etc.

analytical or reagent blank: a blank used as a baseline for the analytical 
portion of a method. For example, a blank consisting of a sample 
from a batch of absorbing solution used for normal samples, but 
processed through the analytical system only, and used to adjust 
or correct routine analytical results.

audit: a systematic check to determine the quality of operation of some 
function or activity. Audits may be of two basic types: 
(1) performance audits in which quantitative data are independently
obtained for comparison with routinely obtained data in a 
measurement system, or (2) system audits of a qualitative nature 
that consist of an onsite review of a laboratory's quality 
assurance system and physical facilities for sampling, calibration, 
and measurement. 

blank or sample blank: a sample of a carrying agent (gas, liquid, or 

solid) that is normally used to selectively capture a material of 
interest and that is subjected to the usual analytical or 
measurement process to establish a zero baseline or background 
value, which is used to adjust or correct routine analytical 
results. 






confidence interval: a value interval that has a designated probability 
(the confidence coefficient) of including some defined parameter of 
the population. 

confidence limits: the outer boundaries of a confidence interval. 

contract: the legal instrument reflecting a relationship between the 
Federal Government and a State or local government or other 
recipient: (1) whenever the principal purpose of the instrument is 
the acquisition, by purchase, lease, or barter, of property or 
services for the direct benefit or use of the Federal Government; 
or (2) whenever an executive agency determines in a specific 
instance that the use of a type of procurement contract is 
appropriate. 

cooperative agreement: the legal instrument reflecting the relationship 
between the Federal Government and a State or local government or 
other recipient whenever: (1) the principal purpose of the 
relationship is the transfer of money, property, services, or 
anything of value to the State or local government or other 
recipient to accomplish a public purpose of support or stimulation 
authorized by Federal statute, rather than acquisition, by purchase, 
lease, or barter, of property or services for the direct benefit or
use of the Federal Government; and (2) substantial involvement is
anticipated between the executive agency acting for the Federal
Government and the State or local government or other recipient
during performance of the contemplated activity.



data validation: a systematic effort to review data to identify any 
outliers or errors and thereby cause deletion or flagging of 
suspect values to assure the validity of the data to the user. 
This "screening" process nay be done by manual and/or computer 
methods , and it may utilize any consistent technique such as sample 
limits to screen out impossible values or complicated acceptable 
relationships of the data with other data. 

extramural review: technical and scientific review of a research or 

demonstration proposal by a qualified individual not an employee of 
the Environmental Protection Agency, such as an employee of industry 
or an academic institution. 

grant: the legal instrument reflecting the relationship between the 
Federal Government and a State or local government or other 
recipient in which: (1) the principal purpose of the relationship 
is the transfer of money, property, services, or anything of value 
to the State or local government or other recipient in order to 
accomplish a public purpose of support or stimulation authorized 
by Federal statute, rather than acquisition, by purchase, lease, or
barter, of property or services for the direct benefit or use of
the Federal Government; and (2) no substantial involvement is
anticipated between the executive agency, acting for the Federal 
Government, and the State or local government or other recipient 
during performance of the contemplated activity. 






grantee: any individual, agency, or entity which has been awarded a 
grant pursuant to grant regulations.

internal review: technical and scientific review of a study or 

demonstration proposal by a qualified employee of the agency. 

measures of dispersion or variability: measures of the difference, 

scatter, or variability of values of a set of numbers. Measures 
of the dispersion or variability are the range, the standard 
deviation, the variance, and the coefficient of variation. 

performance audit: planned independent (duplicate) sample checks of 

actual output made on a random basis to arrive at a quantitative 
measure of the quality of the output. These independent checks are 
made by an auditor subsequent to the routine checks by a field 
technician or laboratory analyst. 

performance audit sample: a sample or sample concentrate (to be diluted 
to a specified volume before analysis) of known (to the EPA only)
true value which has been statistically established by
interlaboratory tests. These samples are commonly provided to
laboratories to test analytical performance. Analytical results 
are reported to the sponsor agency/laboratory for evaluation. 

preaward survey: onsite inspection, review, and discussions with a 

prospective grantee or prospective contractor at his/her facilities.
Discussions would normally include, but not be limited to, the 



proposed project plan, personnel, procedures, schedule, and 
facilities. Normally conducted after receipt of a "best and final" 
offer, but prior to final selection of a contractor. 

proficiency testing: special series of planned tests to determine the 
ability of field technicians or laboratory analysts who normally 
perform routine analyses. The results may be used for comparison 
against established criteria, or for relative comparisons among 
the data from a group of technicians or analysts.

quality: the totality of features and characteristics of a product or
service that bears on its ability to satisfy a given purpose. For
pollution measurement systems, the product is pollution measurement
data, and the characteristics of major importance are accuracy,
precision, and completeness. For monitoring systems,
"completeness," or the amount of valid measurements obtained
relative to the amount expected to have been obtained, is usually
a very important measure of quality. The relative importance of
accuracy, precision, and completeness depends upon the particular
purpose of the user.

quality assurance: a system for integrating the quality planning, quality 
assessment, and quality improvement efforts of various groups in an 
organization to enable operations to meet user requirements at an 
economic level. In pollution measurement systems, quality assurance 
is concerned with all the activities that have an important effect 
on the quality of the pollution measurements, as well as the 



4-6 



establishment of methods and techniques to measure the quality of 
the pollution measurements . The more authoritative usages 
differentiate between "quality assurance" and "quality control," 
where quality control is "the system of activities to provide a 
quality product" and quality assurance "the system of activities to 
provide assurance that the quaulity control system is performing 
adequately. " 

quality assurance manual: an orderly assembly of management policies, 
objectives, principles, and general procedures by which an agency 
or laboratory outlines how it intends to produce quality data.

quality assurance plan: an orderly assembly of detailed and specific 
procedures by which an agency or laboratory delineates how it 
produces quality data for a specific project or measurement method. 
A given agency or laboratory would have only one quality assurance 
manual, but would have a quality assurance plan for each of its 
projects or programs (group of projects using the same measurement 
methods; for example, a laboratory service group might develop a
plan by analytical instrument since the service is provided to a
number of projects). 

quality control: the system of activities designed and implemented to 
provide a quality product. 



quality control (internal): the routine activities and checks, such as
periodic calibrations, duplicate analyses, use of spiked samples, 
etc., included in normal internal procedures to control the 
accuracy and precision of a measurement process. 

quality control (external): the activities which are performed on an

occasional basis, usually initiated and performed by persons outside
normal routine operations, such as onsite system surveys,
independent performance audits, interlaboratory comparisons, etc.,
to assess the capability and performance of a measurement process.

range: the difference between the maximum and minimum values of a set

of values. When the number of values is small (i.e., 12 or less),
the range is a relatively sensitive (efficient) measure of 
variability. 

reliability (general): the ability of an item or system to perform a 
required function under stated conditions for a stated period of 
time. 

reliability (specific): the probability that an item will perform 

a required function under stated conditions for a stated period of 

time. 

representative sample: a sample taken to represent a lot or population 
as accurately and precisely as possible. A representative sample 
may be either a completely random sample or a stratified sample. 






depending upon the objective of the sampling and the conceptual 
population for a given situation. 

spiked sample: a normal sample of material (gas, solid, or liquid) to 
which is added a known amount of some substance of interest. The 
extent of the spiking is unknown to those analyzing the sample. 
Spiked samples are used to check on the performance of a routine 
analysis or the recovery efficiency of a method. 

standard deviation: the square root of the variance of a set of values:

    s = \sqrt{ \sum_{i=1}^{n} (x_i - \bar{x})^2 / (n - 1) }

if the values represent a sample from a larger population;

    \sigma = \sqrt{ \sum_{i=1}^{N} (x_i - \mu)^2 / N }

where μ is the true arithmetic mean of the population. The property
of the standard deviation that makes it most practically
meaningful is that it is in the same units as the values of
the set, and universal statistical tables for the normal
(and other) distributions are expressed as a function of the
standard deviation. Mathematically, the tables could just as
easily be expressed as a function of the variance.
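For illustration only, both forms above can be computed directly (hypothetical function names):

    # Hypothetical sketch of the two standard-deviation forms above.
    import math

    def sample_sd(values):
        # (n - 1) form, for a sample drawn from a larger population.
        n = len(values)
        mean = sum(values) / n
        return math.sqrt(sum((x - mean) ** 2 for x in values) / (n - 1))

    def population_sd(values, mu):
        # N form, when the true population mean mu is known.
        return math.sqrt(sum((x - mu) ** 2 for x in values) / len(values))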



standard reference material (SRM): a material produced in quantity, of 

which certain properties have been certified by the National Bureau 
of Standards (NBS) or other agencies to the extent possible to 
satisfy its intended use. The material should be in a matrix 
similar to actual samples to be measured by a measurement system 
or be used directly in preparing such a matrix. Intended uses 
include: (1) standardization of solutions, (2) calibration of 
equipment, and (3) monitoring the accuracy and precision of 
measurement systems. 

standard reference sample: a carefully prepared material produced from
or compared against an SRM (or other equally well characterized
material) such that there is little loss of accuracy. The sample
should have a matrix similar to actual samples used in the
measurement system. These samples are intended for use primarily
as reference standards to: (1) determine the precision and
accuracy of a measurement system, (2) evaluate calibration standards,
and (3) evaluate quality control reference samples. They may be
used "as is" or as a component of a calibration or quality control
measurement system. Examples: an NBS-certified sulfur dioxide
permeation device is an SRM. When used in conjunction with an air
dilution device, the resulting gas becomes an SRS. An NBS-certified
nitric oxide gas is an SRM. When diluted with air, the resulting
gas is an SRS.






standardization: a physical or mathematical adjustment or correction of 
a measurement system to make the measurements conform to 
predetermined values. The adjustments or corrections are usually 
based on a single-point calibration level. 

calibration standard: a standard prepared by the analyst for the
purpose of calibrating an instrument. Laboratory control
standards are prepared independently from calibration standards
for most methods.

detection limit: that number obtained by adding two standard 
deviations to the average value obtained for a series of 
reagent blanks that are analyzed over a long time period 
(several weeks or months).

replicate analyses : the collection of two samples from the same 
field site which are analyzed at different times but usually
on the same day. 

duplicate analyses: same sample analyzed twice.

laboratory control standard: a standard of known concentration 
prepared by the analyst. 

reference standard: a solution obtained from an outside source 
having a known value and analyzed as a blind sample. 






relative percent error for duplicate analyses: the difference
between the measured concentrations for the duplicate pair
times 100 and divided by the average of the concentrations.

relative percent error for laboratory control standards: the 

difference between the measured value and the theoretically 
correct value times 100 and divided by the correct value. 

relative percent error of a reference sample analysis: the 

difference between the correct and measured values times 100 
and divided by the correct concentration.
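For illustration only (hypothetical function names), the three relative-percent-error definitions above reduce to:

    # Hypothetical sketch of the relative-percent-error definitions.
    def rpe_duplicates(x1, x2):
        # Duplicate pair: difference times 100, divided by the average.
        return abs(x1 - x2) * 100.0 / ((x1 + x2) / 2.0)

    def rpe_against_correct(measured, correct):
        # Control standards and reference samples: difference from the
        # correct value times 100, divided by the correct value.
        return abs(measured - correct) * 100.0 / correct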

Standards Based Upon Usage

calibration standard: a standard used to quantitate the 

relationship between the output of a sensor and a property to 
be measured. Calibration standards should be traceable to 
standard reference materials or primary standards.

quality control reference sample (or working standard): a material 
used to assess the performance of a measurement or portions
thereof. It is intended primarily for routine intralaboratory
use in maintaining control of accuracy and would be prepared 
from or traceable to a calibration standard. 






Standards Depending Upon "Purity" or Established Physical or Chemical
Constants 

primary standard: a material having a known property that is
stable, that can be accurately measured or derived from
established physical or chemical constants, and that is readily 
reproducible. 

secondary standard: a material having a property that is calibrated
against a primary standard. 

statistical control chart (also Shewhart control chart): a graphical

chart with statistical control limits and plotted values (usually 
in chronological order) of some measured parameter for a series of 
samples. Use of the charts provides a visual display of the
pattern of the data, enabling the early detection of time trends 
and shifts in level. For maximum usefulness in control, such charts 
should be plotted in a timely manner, i.e., as soon as the data are 
available. 

system audit: a systematic onsite qualitative review of facilities, 
equipment, training, procedures, recordkeeping, validation, and 
reporting aspects of the total (quality assurance) system to arrive at
a measure of the capability and ability of the system. Even though 
each element of the system audit is qualitative in nature, the 
evaluation of each element and the total may be quantified and 
scored on some subjective basis. 






Test Variability



accuracy: the degree of agreement of a measurement (or an average 
of measurements of the same thing), X, with an accepted 
reference or true value, T, usually expressed as the difference 
between the two values, X-T, or the difference as a percentage 
of the reference or true value, 100(X-T)/T, and sometimes 
expressed as a ratio, X/T. 

bias: a systematic (consistent) error in test results. Bias can 
exist between test results and the true value (absolute bias, 
or lack of accuracy), or between results from different sources 
(relative bias). For example, if different laboratories 
analyze a homogeneous and stable blind sample, the relative 
biases among the laboratories would be measured by the differences 
existing among the results from the different laboratories. 
However, if the true value of the blind sample were known, the 
absolute bias or lack of accuracy from the true value would be 
known for each laboratory. 

precision: a measure of mutual agreement among individual

measurements of the same property, usually under prescribed 
similar conditions. Precision is most desirably expressed in 
terms of the standard deviation but can be expressed in terms 
of the variance, range, or other statistics. Various measures 
of precision exist depending upon the "prescribed similar 
conditions."






replicates: repeated but independent determinations of the replicate
samples, by the same analyst, at essentially the same time and
same conditions. Care should be exercised in considering
replicates of a portion of an analysis and replicates of a 
complete analysis. For example, duplicate titrations of the 
same digestion are not valid replicate analyses, although they 
may be valid replicate titrations. Replicates may be performed 
to any degree, e.g., duplicates, triplicates, etc. 

reproducibility: the precision, usually expressed as a standard
deviation, measuring the variability among results of 
measurements of the same sample at different laboratories. 

variance: mathematically, for a sample, the sum of squares of the 
differences between the individual values of a set and the 
arithmetic mean of the set, divided by one less than the number 
of values. 
